Thursday, August 21, 2008
Solar advocates beef up solar thermal efforts
A National Renewable Energy Laboratory "irradiance" map, which shows the available sunlight around the world, suggests that the United States is the Saudi Arabia of solar energy, with twice the irradiance of Europe. It's a picture worth a thousand words to solar activists looking to make a convincing case for the emerging energy source.
Harnessing global sun power "is just an engineering effort," said Werner Koldehoff of the German Solar Industry Association.
Koldehoff's group and other European solar enthusiasts have come to America to make the case for solar thermal technology, an alternative to photovoltaics (PV) that attempts to harness the efficient phase change from water to steam. For cost and technology reasons, solar thermal is emerging as the preferred alternative energy technology in the race to replace fossil fuels with sustainable energy sources, many experts agree.
Solar benefits
Along with cost per watt—eventually cost per kilowatt—solar thermal's biggest selling point is its ability to store energy and deliver electricity to consumers during periods of peak power demand. Experts at a recent solar energy conference said "concentrating" solar thermal power could allow utilities and other emerging operators to store steam energy for up to six hours. Super-heated steam is used to drive turbines that generate electricity.
For residential, business and other lower-temperature applications, solar thermal could be used to heat water as well as for space heating. Koldehoff said the approach could also be harnessed for an emerging application he calls "solar-assisted cooling." Air conditioning requires roughly 4.5 times as much energy as heating, and the largest amount of solar energy is available in the late afternoon, coinciding with peak demand for air conditioning. Hence, advocates say, solar thermal power offers the least-expensive source of electricity when demand is highest.
Koldehoff said pilot solar-cooling projects are already under way in sunny Spain, and the technique could also be used for applications such as operating power-hungry desalination plants. "The real future application in the next five years is [solar] cooling, and we need it badly, because we can't afford [the soaring cost of cooling] anymore," he said.
Concentrating, or sun-tracking, photovoltaics and solar thermal power collectors such as parabolic troughs follow the sun across the sky on one or more axes, focusing sunlight by as much as 1,500-fold in high-end systems to improve the efficiency of solar panels.
Power concentration
Experts note that solar thermal's so-called "dispatchability" means stored power can reliably generate electricity for sale to utilities during load peaks on electric grids, usually after 5 p.m. This "load-shifting" approach makes solar thermal power far more valuable for plant operators than, say, photovoltaic energy that must be used immediately.
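To make the load-shifting economics concrete, here is a minimal sketch in Python of a day in the life of a plant with six hours of storage; the solar profile, turbine limit and prices are invented for illustration, not figures from the article.

    # Illustrative sketch of solar thermal load shifting: collect heat during
    # daylight, bank up to six hours' worth, and dispatch electricity at the
    # evening peak. All profile, price and turbine numbers are invented.
    STORAGE_CAP_MWH = 60          # ~6 hours of storage at 10MWh per hour
    TURBINE_MWH_PER_HOUR = 10     # assumed generation limit
    PEAK_START = 17               # grid load peaks after 5 p.m.

    solar_input = {h: 10 if 9 <= h <= 16 else 0 for h in range(24)}       # MWh thermal
    price = {h: 120 if PEAK_START <= h <= 21 else 40 for h in range(24)}  # $/MWh

    stored, revenue = 0.0, 0.0
    for hour in range(24):
        stored = min(stored + solar_input[hour], STORAGE_CAP_MWH)
        if hour >= PEAK_START and stored > 0:
            dispatched = min(stored, TURBINE_MWH_PER_HOUR)
            stored -= dispatched
            revenue += dispatched * price[hour]

    print(f"Revenue with load shifting: ${revenue:,.0f}")
    # A plant without storage must sell the moment the sun shines, mostly at
    # the off-peak price -- the gap is what makes dispatchability valuable.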
"Thermal energy storage is the killer app of concentrating solar power technology," Andrew McMahan, VP for technology and projects at SkyFuel, told a packed solar technology conference last month held in conjunction with Semicon West. Solar thermal collector technologies like parabolic troughs have a good track record after more than 20 years of use, McMahan added. "The technology has steadily improved and is being demanded by utilities" when negotiating power supply agreements with solar operators.
Industry analysts like Jim Hines, Gartner Inc. research director for semiconductors and solar, agree that solar thermal appears best suited to large power projects aimed at supplying electricity to utilities. Other technologies, such as flat-plate photovoltaics and concentrating PV systems, work best in residential and commercial applications, Hines said. Photovoltaic "cost projections are encouraging, but future demand will depend on external factors" like solar thermal becoming the technology of choice.
Among the large solar thermal projects discussed at the solar confab were several "power tower" projects that use concentrating solar collectors to refocus sunlight on "solar boilers." For example, solar developer BrightSource Energy is building a 400MW solar thermal complex in California's Mojave Desert, a prime location for a number of planned solar thermal projects. Along with other industry executives, BrightSource CEO John Woolard noted that the primary challenge for solar thermal is efficiently transmitting power from remote desert locations to cities.
"Solar has become an important part of our resource mix," said Hal LaFlash, director of resource planning for Pacific Gas & Electric. "The big challenge is transmission [because] the highest resource potential is remote from population centers."
'Fossil-assisted solar'
Still, experts agreed that for large alternative-energy projects, solar thermal appears to be the best approach. According to estimates compiled by the Prometheus Institute for Sustainable Development, solar thermal power-generating costs could drop from about $4.25/W in 2008 to $2.50/W by 2020.
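Those figures imply a modest but steady decline; a two-line Python sanity check of the quoted numbers:

    # The Prometheus figures quoted above -- $4.25/W in 2008 falling to $2.50/W
    # by 2020 -- imply roughly a 4.3 percent annual decline in generating cost.
    start_cost, end_cost, years = 4.25, 2.50, 2020 - 2008
    annual_decline = 1 - (end_cost / start_cost) ** (1 / years)
    print(f"Implied annual cost decline: {annual_decline:.1%}")   # about 4.3%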
Solar thermal "is an extremely cost-effective technology compared with other [solar] technologies," though costs may not drop as fast as competing technologies like flat-plate or concentrating photovoltaics, said Travis Bradford, founder of the Prometheus Institute.
Nevertheless, solar thermal's "load-shifting" capability allows producers to store energy and then sell electricity during periods of peak demand. The predictability of solar thermal power, along with technology innovations, could help drive down start-up costs as the solar power infrastructure is built. Proponents add that the amount of energy needed to build and deploy solar thermal technologies is recovered in less than a year, more than twice as fast as comparable photovoltaic systems.
For now, advocates envision an energy future where solar energy supplements current fossil fuels. But as a sustainable energy infrastructure is built and solar technologies become more reliable and affordable, solar boosters like the German activist Werner Koldehoff are talking about a future in which dwindling fossil fuels are used to back up abundant solar power.
The earth's energy future hinges on "fossil-assisted solar," said Koldehoff. "We have a responsibility to future generations to make this happen."
- George Leopold
EE Times
CDMA is not dead, notes ABI
Although the overall dynamics of CDMA markets are overshadowed by the hype around UMTS/HSPA and the migration to LTE, CDMA operators continue to upgrade their networks to provide capacity for a growing number of bandwidth-intensive data services, as well as escalating traffic loads. This is according to ABI Research's report "Mobile Subscriber Market Data".
"Global EVDO Rev A subscriber numbers ramped up more than eightfold between Q2 07 and Q2 08," says ABI analyst Khor Hwai Lin. "The United States and South Korean markets show the highest growth rate for EVDO Rev A. The increased support for LTE from incumbent CDMA operators does not imply the imminent death of EVDO Rev A and B, because LTE is addressing different market needs compared to 3G."
EVDO Rev A subscribers will exceed 54 million by 2013 while Rev B subscribers will reach 25 million, reports ABI.
Over 31 million subscribers worldwide were already using HSDPA, while 3.2 million subscribers were on HSUPA networks by Q2 08. Upgrades to HSUPA continue to take place aggressively in Western Europe and the Asia-Pacific region. Hence, HSUPA subscribers are estimated to hit 139 million by 2013.
"HSPA+ will contest with LTE and mobile WiMAX in the mobile broadband space," adds Asia-Pacific VP Jake Saunders. The 100Mbit/s download data rate difference between LTE (20MHz) and HSPA+ may not attract mid-tier operators to migrate, as LTE is based on OFDM technology that requires new components, while a move to HSPA+ is perceived to be more gradual transition."
Due to the large number of GSM 900 subscribers and the high possibility of refarming the spectrum for UMTS, ABI estimates that the majority of global subscribers (about 1.2 billion by 2013) will be on the 900MHz-only band. In second place would be dual-band users on 900MHz and 1,800MHz (1 billion by 2013). Subscribers on the 2,100MHz band will ramp up steadily at a CAGR of 23.5 percent between 2007 and 2013.
Samsung leads NAND market ranking in Q2
The NAND flash memory market ranking remained constant in Q2, but Samsung Electronics Co. Ltd stood as the only profitable supplier during the period, according to iSuppli Corp.
Meanwhile, Intel Corp. is trailing near the back of the pack in the rankings. Samsung stayed in the top spot in NAND share in Q2, followed in order by Toshiba, Hynix Semiconductor Inc., Micron Technology Inc., Intel, Numonyx and Renesas Technology, according to iSuppli.
Global NAND flash memory revenue declined to $3.36 billion in Q2, down 2.5 percent from $3.45 billion in Q1, according to the research firm. "Five of the top seven NAND suppliers had either declines or zero revenue growth during the period," it added.
The NAND flash market has slid sharply since the start of 2008. iSuppli has reduced its 2008 annual NAND flash revenue growth forecast from 9 percent to virtually zero.
Hitting the mark
"Based on the recent rankings, Samsung maintained its number 1 position in the NAND market with 42.3 percent market share recorded during Q2 08," said iSuppli. In the quarter, it added Samsung's NAND sales hit $1.422 billion, down 1.9 percent from the first period. (See rankings table)
Toshiba Corp. holds the second position in the NAND market ranking with 27.5 percent market share recorded during Q2 08, it noted. "During the quarter, Toshiba's NAND sales hit $925 million, down 1.8 percent from the first period," said iSuppli.
"Hynix takes the third spot in the NAND market with 13.4 percent market share recorded during Q2 08," said iSuppli. "Its NAND sales hit $450 million, down 13.1 percent from the first period," it added.
"Micron holds fourth place in the NAND market with 8.9 percent market share recorded during the Q2 08, with NAND sales of $300 million, up 11.9 percent from the first period," it noted.
Intel Corp. holds the fifth spot in the NAND market with 5.2 percent market share and sales of $174 million, up 4.8 percent from the first period.
iSuppli said Micron narrowed its NAND flash memory market-share gap with Hynix, setting the stage for a battle for the industry's third rank this year.
The research firm believes that because Hynix is expected to concentrate mainly on DRAM throughout 2008, Micron's NAND market-share gap with Hynix will narrow further.
"In the memory world, process migrations and wafer scale are two crucial factors in driving down costs," said Nam Hyung Kim, analyst, iSuppli, in a statement.
"Micron has rapidly increased its 300mm wafer capacity, and the 32nm geometry will boost its profitability in the near future, as well as its productivity. Because of its aggressive production ramp, Micron challenges Hynix for the market's third spot. By 1H 09, it is likely to compete with industry leaders Samsung and Toshiba based on profitability," he added.
The DRAM rankings did not change in Q2 08, but Elpida Memory Inc. is gaining ground on Hynix. Amid a major downturn in the sector, Samsung was still in first place in the DRAM rankings for Q2, followed in order by Hynix, Elpida, Micron, Qimonda AG, Powerchip Semiconductor Corp., Nanya Technology Corp., Promos, Etron Technology Inc. and Winbond Electronics Corp., according to iSuppli.
- Mark LaPedus
EE Times
Green trends heat up for next designs
Gone are the days when design was, well, design. Today it's design-for-manufacturability (DFM), design-for-quality, design-for-cost and design-for-environment (DfE).
DfE takes into account the environmental impact of a product from the time of its inception to the end of life, and then back into the resource pool for future products, typically referred to as cradle-to-cradle. "It's a radical departure from the status quo," said Pamela Gordon, lead consultant, Technology Forecasters Inc. (TFI).
"Over the past 50 years, we've moved to a disposable mentality for electronics. The benefits were quick and easy access to new technologies, but we had a buildup of electronic waste. DfE makes the product useful for many years," she added.
Maximizing use
When a product is viewed through a DfE lens, virtually every aspect is affected, including size, weight and energy requirements. An important question often asked is, "Are there opportunities for reducing the number of components and consolidating components?" Doing so could save board real estate and trim both the BOM and the supplier count.
The choice of materials for both the product and the packaging is essential. Redesigning the product for ease of disassembly enables the reusable parts to be recovered at end of life. For parts that can't be reused, the design has to maximize those that are recyclable to reduce the waste going to landfill.
Once the product is designed, there are supply chains and logistics issues to consider, such as determining the manufacturing location to reduce the cost and carbon footprint. Another factor is how many miles all the components have to travel before they come together in the final product at the customer's location. One top-tier electronics OEM estimates that the carbon footprint of its supply chain is 20 times that of its own operations.
Seems like a lot to consider? It is, but virtually none of the DfE considerations are inconsistent with the cost or quality requirements of design. In fact, they can contribute in a positive way to both cost and quality.
The Xerox experience
Consider Xerox Corp., which has had a formal environmental commitment since 1991. By applying the principles of DfE to the design of the iGen3, a commercial printing system, the company dramatically improved the environmental impact of the product. More than 90 percent of the parts and subsystems within the machine are either recyclable or can be remanufactured. Eighty percent of the waste produced by the iGen3 is reusable, recyclable or returnable.
Besides Xerox, many other top-tier OEMs are engaged in DfE in one form or another: Hewlett-Packard, Apple Inc., IBM Corp. and Intel Corp. all have DfE programs. Among midtier and smaller companies, however, adoption has been slow, and in some cases efforts are nonexistent. "I don't see companies dealing seriously with DfE," said Michael Kirschner, president of Design Chain Associates, a design-consulting firm based in San Francisco. "There's no real incentive outside of the fear of Greenpeace," he added.
EU compliance
Gordon said: "Most electronics companies have only gone as far as compliance with the European Union's environmental directives on ROHS and WEEE. "DfE is like the quality movement of the 1980s. Those who are slow to embrace the trend did less favorably than those that figured out it produced financial benefit," she added.
There are a few exceptions, however. One is Blue Coat Systems, a high-growth maker of appliance-based solutions that help IT organizations optimize security and accelerate performance between users and applications across the enterprise WAN. A year ago, the company gathered a group of hardware engineers and product managers in a room for a DfE workshop and asked them to disassemble several products: Blue Coat's own, competitors' products and a benchmark product that had applied DfE principles.
"The exercise was eye opener. We were surprised by how well Blue Coat products measured up to the benchmark product, even though we had not consciously designed for ease of recyclability," said David Cox, VP of operations, Blue Coat. "The exercise made us realize that we had a great opportunity to integrate DfE into the next generation of our product," he added.
Blue Coat created a cross-functional team to explore opportunities for DfE in a next-generation product. One design change the team made was in power supplies: the current generation employed a power supply that was less than 80 percent efficient, so Blue Coat set a goal of more than 90 percent efficiency.
Size does matter
The designers chose an open-frame power supply that was smaller and 50 percent lighter, had better heat dissipation and consumed less energy. Because of its smaller size, more units can be packaged in a container, which means lower CO2 emissions per unit during transport. For the next fiscal year, Blue Coat is focusing on environmental initiatives that will save the company in excess of $1 million. "That's a conservative estimate," said Cox. "Employees really care about this," he added. "The beauty of it is that it's a no-brainer. You save the company money, improve the customer experience and help the environment," he stressed.
- Bruce Rayner
EE Times
Intel's Canmore connects TV, CE devices to Internet
At the Intel Developer Forum, Intel Corp. introduced the Intel Media Processor CE 3100, the first in a new family of purpose-built SoCs for consumer electronics (CE) devices based on the company's popular Intel architecture (IA) blueprint.
Executives also provided updates on the mobile Internet device (MID) category and the Intel Atom processor, unveiled a brand alliance with DreamWorks Animation SKG Inc. around the shift to 3D movie-making, and outlined a number of efforts to speed many-core processor software design.
The CE 3100 has been developed for Internet-connected CE products such as optical media players, connected CE devices, advanced cable STBs and DTVs. The media processor (previously codenamed "Canmore") combines CE features for high-definition video support, home-theater quality audio and advanced 3D graphics, with the performance, flexibility and compatibility of IA-based hardware and software.
Intel expects to begin shipments of this product next month.
Intel and its customers have been working together to develop a variety of products for emerging growth areas—consumer electronics, MIDs, netbooks and embedded computers—each based on Intel architecture that enables uncompromised Internet access.
"As consumers look to stay connected and entertained regardless of where they are and what device they are using, the Web continues to affect our lives in new ways and is quickly moving to the TV thanks to a new generation of Internet-connected CE devices," said Eric Kim, Intel senior VP and general manager of digital home group. "As Intel delivers its first IA SoC with performance and Internet compatibility for CE devices, we are providing a powerful and flexible technology foundation upon which the industry can quickly innovate upon. This technology foundation will help the high-tech industry bring devices to market faster, as well as encourage new designs and inspire new services, such as connecting the TV to the Internet."
Extending IA into consumer electronics
Another SoC product from Intel, the Intel Media Processor CE 3100 is a highly integrated solution that pairs a powerful IA processor core with leading-edge multistream video decoding and processing hardware. It also adds a three-channel 800MHz DDR2 memory controller, dedicated multichannel dual audio DSPs, a 3D graphics engine enabling advanced UIs and EPGs, and support for multiple peripherals, including USB 2.0 and PCIe.
The Intel Media Processor CE 3100 also features Intel Media Play Technology, which combines hardware-based decoding for broadcast TV and optical media playback with software-based decoding for Internet content. When a consumer watches broadcast TV or content on optical media players, the video is encoded in standard formats such as MPEG-2, H.264 or VC-1, and Intel Media Play Technology software routes it to the on-chip hardware decoders. When the consumer views Internet content, the software automatically routes the video, and audio as applicable, to a software codec running on the IA processor core. As the Internet becomes more pervasive, the ability to decode multiple video and audio formats will give the industry greater flexibility in adapting to evolving standards and technologies, and consumers a wider range of viewing experiences.
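The routing decision described above is straightforward to express. The sketch below is a hypothetical Python illustration of the logic as described; the names are invented for the example and are not Intel's actual software interface.

    # Hypothetical sketch of the Media Play routing decision: standard
    # broadcast/optical formats go to the on-chip hardware decoders, while
    # Internet content falls back to a software codec on the IA core.
    HARDWARE_FORMATS = {"MPEG-2", "H.264", "VC-1"}   # per the article

    def route_stream(video_format: str, source: str) -> str:
        """Pick a decode path for an incoming stream."""
        if source in ("broadcast", "optical") and video_format in HARDWARE_FORMATS:
            return "hardware decoder"
        return "software codec on IA core"   # Internet or other formats

    print(route_stream("H.264", "broadcast"))   # -> hardware decoder
    print(route_stream("VP6", "internet"))      # -> software codec on IA core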
The Intel Media Processor CE 3100 is scheduled to ship to CE manufacturers.
Additionally, Intel announced the next generation of parallel programming tools that offer new options for multi-core software development for mainstream client applications. The Intel Parallel Studio includes expanded capabilities for helping design, code, debug and tune applications to harness the power of multicore processing through parallel programming. Intel Parallel Studio will ease the path for parallel application development to deliver performance and forward scaling to many-core processors for Microsoft Visual Studio developers.
Saturday, August 2, 2008
Media drives 12% increase in home nets
A new report from IMS Research forecasts that WLANs, xDSL access networks and DSPs will continue to dominate home networking and sees an estimated 12 percent growth in unit sales over the current five-year period.
"The opportunity is for those who can cash in on rising interest in whole home multimedia networks for video and voice," said Tom Hackenberg, an embedded processing research analyst with IMS. "Home data networks are beginning to mature, but multimedia capable whole home networks are still very much an emerging market," he said.
The market for access and LAN devices in the home will rise 11.9 percent on a compound annual basis from 256 million units in 2007 to about 450 million units in 2012, according to a new report from IMS. However, revenues will increase at a statelier 7.6 percent pace from about $10 billion in 2007 to $14.5 billion in 2012, as product prices decline.
"Interestingly security and home automation products will piggyback on the emerging media networks," Hackenberg added.
LAN devices such as routers, bridges and interface cards will see the fastest growth, rising some 17.6 percent on a unit basis over the period to about 265 million units, according to IMS. The percentage of those devices based around wireless nets such as Wi-Fi will grow from about 65 percent today to about 70 percent by 2012, the report projects.
"There also will be significant growth in hybrid wireless/wired home networks over the next five years," said Hackenberg.
Wireless Ethernet links currently make up the second largest number of connections, with some 30 million links deployed, but the segment is growing only about 5 percent on a compound annual basis. By contrast, powerline, phoneline and coax links are growing at rates ranging from 28 to 50 percent.
Worldwide, xDSL technologies continue to be the home access net of choice. About 71 percent of home access networks were based on some form of xDSL in 2007, a slice that will decline just slightly to 64 percent by 2012, according to the report.
Under the covers, DSPs will continue to dominate other processor types as the most prevalent in home networking systems. About 500 million DSPs shipped into home net systems in 2007, a figure that will grow to more than 825 million by 2012, IMS predicts. The next two largest categories in digital silicon for home networks are 4- to 8-bit microprocessors and ASSPs, roughly tied at a little less than 400 million units each shipping into home net systems by 2012.
"DSPs will continue to be the cheapest alternative for signal processing jobs that will become increasingly important as home nets move to carrying more voice and video traffic," Hackenberg said.
Home networking systems are undergoing a transition from hard-coded MCUs to low end microprocessors. That's because designers need more performance and flexibility to deal with nets that increasingly sport more bandwidth to link to a growing number of devices, he added.
- Rick Merritt
EE Times
Siemens finds JV partner for enterprise comms biz
Siemens has announced that The Gores Group will acquire a 51 percent stake in its enterprise communications business Siemens Enterprise Communications (SEN). Siemens will retain a stake of 49 percent.
"We have been looking for an opportunity to expand our presence in the enterprise networking and communications space and this partnership with Siemens provides the perfect fit," noted Alec Gores, founder and chairman of Gores.
"We are continuing to intensify the focus of our portfolio on the three sectors, which are energy, industry and healthcare. In Gores, we have found an extremely experienced technology and telecommunications partner, who strengthens the business with the contribution of the two assets Enterasys and SER Solutions," said Joe Kaeser, Siemens chief financial officer. The deal of the joint venture is expected to close at the end of Siemens fiscal year 2008, pending regulatory approval.
Gores and Siemens plan to invest approximately 350 million euros ($544 million) in the joint venture—not including expenditures for R&D and other expenditures as part of the ordinary course of business. The investments will be made in order to launch innovative SEN products on the market, acquire other technology platforms to capitalize on the powerful SEN distribution organization and further drive the expansion and transition of the business from a hardware supplier to a software and service provider.
When the joint venture is launched, the SEN business will also be supplemented and strengthened by combining it with two of Gores' current portfolio companies: Enterasys, a network equipment and security solutions provider, and SER Solutions, a call center software company. "Combining the three companies will lead to a more complete enterprise networking and communications offering that will leverage SEN's powerful distribution capabilities, global reach and extensive customer base," stated Alec Gores.
On an operational level, the business will be driven by Gores, but the JV company will be entitled to continue using the Siemens brand. Key patents and licenses will be transferred to the joint venture, as will production facilities in Leipzig, Germany; Curitiba, Brazil; and Thessaloniki, Greece.
Three giants collaborate on cloud computing
HP, Intel Corp. and Yahoo Inc. have announced the creation of a global, multi-datacenter, open source test bed for the advancement of cloud computing research and education. The initiative aims to foster collaboration among industry, academia and governments by removing barriers to research in data-intensive, Internet-scale computing.
The three companies' cloud computing test bed is intended to provide a globally distributed, Internet-scale testing environment that encourages research on the software, datacenter-management and hardware issues associated with cloud computing at larger scale, including support for research groups working on cloud applications and services.
Diverse connection
HP, Intel and Yahoo have partnered with the Infocomm Development Authority of Singapore (IDA), the University of Illinois at Urbana-Champaign and the Karlsruhe Institute of Technology in Germany to create the research initiative. The U.S. National Science Foundation is also a partner.
The test bed will initially consist of six "centers of excellence" at IDA facilities, the University of Illinois at Urbana-Champaign, the Steinbuch Centre for Computing of the Karlsruhe Institute of Technology, HP Labs, Intel Research and Yahoo. Each location will host a cloud computing infrastructure, largely based on HP hardware and Intel processors, with 1,000 to 4,000 processor cores capable of supporting the data-intensive research associated with cloud computing. The test bed locations are expected to be in full operation and accessible to researchers worldwide, through a selection process, later this year.
Kick-off strategy
The test bed will draw on Yahoo's technical leadership in open source projects by running Apache Hadoop, an open source distributed-computing project of the Apache Software Foundation, as well as other open source distributed-computing software such as Pig, the parallel programming language developed by Yahoo Research.
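For readers unfamiliar with Hadoop, its programming model is MapReduce: a map step emits key/value pairs and a reduce step aggregates them by key. Here is the idea in miniature as plain single-machine Python; Hadoop distributes the same pattern across a cluster, and this is not Hadoop's own API.

    # MapReduce in miniature: map emits (key, value) pairs, a shuffle groups
    # them by key, reduce aggregates each group. Word count is the classic demo.
    from collections import defaultdict

    def map_phase(documents):
        for doc in documents:
            for word in doc.split():
                yield word, 1                 # emit one pair per occurrence

    def reduce_phase(pairs):
        groups = defaultdict(int)             # the "shuffle": group by key
        for word, count in pairs:
            groups[word] += count             # reduce: sum each group
        return dict(groups)

    docs = ["cloud computing test bed", "cloud research at internet scale"]
    print(reduce_phase(map_phase(docs)))      # word frequencies across docs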
"The HP, Intel and Yahoo cloud computing test bed is extending our commitment to the global, collaborative research community, along with the advancement of new sciences in the Internet," said Prabhakar Raghavan, head of Yahoo Research. "This test bed will enable researchers to test applications at the Internet scale and provide them access to the underlying computing systems to advance their learning on how systems software and hardware function in a cloud environment," he added.
Specialized cloud services
Researchers at HP Labs will use the test bed to conduct advanced research in the areas of intelligent infrastructure and dynamic cloud services. HP Labs recently refocused its strategy to help move HP and its customers toward cloud computing, a driving force behind HP's vision of 'Everything as a Service.' In this vision, devices and services interact through the cloud, and businesses and individuals use services that cater to their needs based on location, preferences, calendar and communities.
"The technology industry must think about the cloud as a platform for creating new services and experiences. This requires an entirely new approach to the way we design, deploy and manage cloud infrastructure and services," said Prith Banerjee, senior VP of research, HP, and director, HP Labs. "The HP, Intel and Yahoo! Cloud Computing Test Bed will enable us to tap the brightest minds in the industry, as well as other related sectors to share their ideas in promoting innovation," he added.
Intel's participation
"We are willing to engage with the academic research community," said Andrew Chien, VP and director of Intel Research. "Creating large-scale test beds is essential to draw away barriers to innovation and encourage experimentation and learning at scale," he noted.
"With the ready and available Internet-scale resources in Singapore to support cloud computer research and development work, we can collaborate with like-minded partners to advance the field," said Khoong Hock Yun, assistant chief executive, infrastructure development group, Infocomm. "Cloud computing is the next paradigm shift in computer technology, and this may be the next 'platform' for innovative ecosystems. This will allow Singapore to leverage this new paradigm for greater economic and social growth," he added.
In November 2007, Yahoo announced the deployment of a supercomputing-class datacenter, called M45, for cloud computing research. Carnegie Mellon University was the first institution to take advantage of the supercomputer. Yahoo also announced this year an agreement with Computational Research Laboratories to jointly support cloud computing research and to make one of the 10 fastest supercomputers in the world available to academic institutions in India.
High-performance innovations
In 2008, HP announced the formation of its Scalable Computing & Infrastructure Organization (SCI), which includes a dedicated set of resources that provide expertise and spearhead development efforts to build scalable solutions designed for high-performance and cloud computing customers. The company introduced scalable computing offerings including the Intel Xeon-based HP ProLiant BL2x220c G5, the first server blade to combine two independent servers in a single blade, and the HP StorageWorks 9100 Extreme Data Storage System (ExDS9100), a highly scalable storage system designed to simplify the management of multiple petabytes. HP also introduced the HP Performance-Optimized Datacenter, an open architecture, compact and shipped-to-order alternative for deploying IT resources.
Compact housing defines UHF FM transmitter module
Radiometrix's TX2S is a miniature PCB-mounted UHF radio transmitter for the 433.92MHz (UK) or 434.42MHz (European) radio bands.
Contained within a compact package measuring 20mm x 10mm x 2mm, the TX2S allows design engineers to create a simple data link (with either a node-to-node or multi-node architecture) capable of supporting data rates of up to 40Kbit/s at distances of as much as 75m in-building and 300m across open ground.
The crystal-based, PLL-controlled FM transmitter operates from an input voltage of between 2.2V and 4V (allowing it to be used in portable, battery-powered system designs) and delivers a nominal +0dBm at 7mA. The transmitter module is approved to the EN 300 220-3 and EN 301 489 standards. Internal filtering helps keep EMC levels low by minimizing spurious radiation.
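For readers converting the quoted RF figures: dBm is decibels referenced to 1mW, so +0dBm is 1mW of radiated power. A quick worked check in Python, assuming a nominal 3V operating point (our assumption for the arithmetic, not a datasheet figure):

    # Converting the transmitter's quoted figures: +0dBm output against a
    # supply draw of 7mA at an assumed nominal 3V (the module accepts 2.2-4V).
    def dbm_to_mw(dbm: float) -> float:
        return 10 ** (dbm / 10)       # dBm is decibels referenced to 1mW

    rf_out_mw = dbm_to_mw(0.0)        # +0dBm -> 1.0mW radiated
    supply_mw = 3.0 * 7.0             # V * mA = mW drawn from the supply
    print(f"RF out {rf_out_mw:.1f}mW of {supply_mw:.0f}mW drawn "
          f"-> {rf_out_mw / supply_mw:.0%} overall efficiency")   # ~5%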
Key applications include car/building security systems, EPOS monitoring, inventory tracking, remote industrial process control, and computer networking. Price is about $15 each in quantities between 1 and 99.
- Ismini Scouras
eeProductCenter
Multithreading comes undone
EDA vendors have struggled to meet the challenge of multicore IC design by rolling out multithreading capabilities for their tools. Nonetheless, the question cannot be ignored: Is multithreading the best way to exploit multicore systems effectively?
"Threads are dead," asserted Gary Smith, founder and chief analyst for Gary Smith EDA. "It is a short-term solution to a long-term problem."
At the 45nm node, more and more designs reach and exceed the 100 million-gate mark. These designs break current IC CAD tools, forcing EDA vendors to develop products capable of parallel processing.
Until now, parallel processing has relied on threading. Threading, however, tends to show its limits at four processors, and EDA vendors may have to come up with new ways of attacking the problem.
"Threads will only give you two or three years," Smith said. "Library- or model-based concurrency is the best midterm approach."
Looking into the future
EDA vendors interviewed at the 2008 Design Automation Conference (DAC) painted a more nuanced picture of the future of multithreading.
"We have not seen the limits to multithreading in the timing-analysis area," said Graham Bell, marketing counsel for EDA startup Extreme DA Corp. "We see good scaling for three or four process threads. We get to see difficulties beyond that, but they are not dramatic."
With Extreme DA's GoldTime, a multithreaded static and statistical timing analyzer, the company has applied a fine-grained multithreading technique based on ThreadWave, a netlist-partitioning algorithm. "Because of our unique architecture, we have a small memory footprint," Bell said. "We have not seen the end of taking advantage of multithreading."
For applications with fine-grained parallelism, multithreading is one of the most generic ways to exploit multicore, said Luc Burgun, CEO of Emulation and Verification Engineering SA. "On the other hand, multithread-based programs can also be quite difficult to debug." That's because they "break the sequential nature of the software execution, and you may easily end up having nondeterministic behavior and a lot of headaches."
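The nondeterminism Burgun warns about is easy to reproduce: two threads doing an unsynchronized read-modify-write on shared state can lose updates, and the result can differ from run to run. A minimal Python demonstration:

    # Two threads increment a shared counter without a lock. The increment is
    # a separate load, add and store, so updates can be lost -- the printed
    # total is often below 200000 and can vary between runs.
    import threading

    counter = 0

    def worker(iterations: int) -> None:
        global counter
        for _ in range(iterations):
            counter += 1              # racy read-modify-write

    threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)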
According to Burgun, multiprocess remains the "easiest and safest way to exploit multicore." He said he expects some interesting initiatives to arise from parallel-computing experts to facilitate multicore programming. "From that standpoint, CUDA [the Nvidia-developed Compute Unified Device Architecture] looks very promising," Burgun said.
Simon Davidmann, president and CEO of Imperas Ltd, delivered a similar message. "Multithreading is not the best way to exploit multicore resources," he said. "For some areas, it might be OK, but in terms of simulation, it is not."
Multithreading is not the only trick up Synopsys Inc.'s sleeve, said Steve Smith, senior director of product platform marketing. "Within each tool, there are different algorithms. When looking at each tool, we profile the product to see the largest benefits to multithreading," he said. "Multithreading is not always applicable. If not, we do partitioning."
As chipmakers move to eight and 16 cores, a hybrid approach will be needed, asserted Smith, suggesting a combination of multithreading and partitioning.
To illustrate the point, Smith cited a host of Synopsys multicore solutions in the area of multithreading. "HSpice has been broadly used by our customers. This is typically the tool you do not want to start from scratch," he said.
HSpice multithreading has come in stages, noted Smith. "Last year, we multithreaded the model-evaluation piece, and it gave a good speedup. Then, in March, we introduced the HSpice multithreaded matrix solver. We want to make sure our customers are not impacted, and we do it [multithreading] piece by piece," he said.
Another trend Synopsys is investigating, Smith continued, is pipelining. This technique—an enterprise-level activity, since it demands the involvement of IT—collapses multiple tasks, such as optical proximity correction and mask-data preparation, into a single pipeline.
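The pipelining Smith describes is the classic producer-consumer pattern: each stage starts on a block of work as soon as the previous stage hands it off, rather than waiting for the whole batch. A sketch of the pattern in Python, with placeholder stage names standing in for the real OPC and mask-data-preparation steps:

    # Two pipeline stages chained with queues: the "mask-prep" stage works on
    # block N while "OPC" is already processing block N+1. Stage bodies are
    # placeholders, not real EDA computations.
    import queue
    import threading

    def stage(name, inbox, outbox):
        while True:
            item = inbox.get()
            if item is None:              # sentinel: shut down, pass it on
                if outbox:
                    outbox.put(None)
                break
            result = f"{item}->{name}"    # stand-in for the real work
            print(result)
            if outbox:
                outbox.put(result)

    q1, q2 = queue.Queue(), queue.Queue()
    opc = threading.Thread(target=stage, args=("OPC", q1, q2))
    prep = threading.Thread(target=stage, args=("mask-prep", q2, None))
    opc.start(); prep.start()

    for block in ("block0", "block1", "block2"):
        q1.put(block)                     # feed design blocks in
    q1.put(None)
    opc.join(); prep.join()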
Last year, Magma Design Automation Inc. unveiled an alternative to multithreading, using a streaming-data-flow-based architecture for its Quartz-DRC design rule checker. Multithreading provides a less-fine-grained parallel-processing capability than Magma's data flow architecture, said Thomas Kutzschebauch, senior director of product engineering at Magma.
Magma's multicore strategy is focused on massive parallelism, Anirudh Devgan, VP and general manager of the custom design business unit, said at a DAC panel session on reinventing EDA with "manycore" processors.
"Four CPU boxes are just the beginning of a trend, and EDA software has to work on large CPUs with more than 32 cores," he said. "Parallelism offers an opportunity to redefine EDA productivity and value. But just parallelism is not enough, since parallelizing an inefficient algorithm is a waste of hardware."
Devgan's conclusion was that tools have to be productive, integrated and massively parallel.
Seeing beyond C
As he unveiled "Trends and What's Hot at DAC," Gary Smith expressed doubts about C as the ultimate language for multicore programming. He cited the identification of a new embedded-software language as one of the top 10 issues facing the industry this year, and asserted, "a concurrent language will have to be in place by 2015."
EDA executives did not debate the point. "We will change language over time," stated Joachim Kunkel, VP and general manager of the solutions group at Synopsys. "We are likely to see a new language appear, but it takes time. It is more an educational thing."
On the software side, meanwhile, reworking the legacy code is a big issue, and writing new code for multicore platforms is just as difficult. Nonetheless, Davidmann held that "the biggest challenge is not writing, reworking or porting code, but verifying that the code works correctly, and when it doesn't, figuring out how to fix it. Parallel processing exponentially increases the opportunities for failure."
Traditionally, Davidmann said, software developers think sequentially. Now, that has to change. Chip design teams have been writing parallel HDL for 20 years, so it's doable—though it will take much effort and new tool generations to assist software teams in this task.
With single-processor platforms and serial code, functional verification meant running real data and tests directed to specific pieces of functionality, Davidmann said. "Debug worked as a single window within a GNU project debugger."
But with parallel processing, "running data and directed tests to reduce bugs does not provide sufficient coverage of the code," he said. "New tools for debug, verification and analysis are needed to enable effective production of software code."
Davidmann said Imperas is announcing products for verification, debug and analysis of embedded software for heterogeneous multicore platforms. "These tools have been designed to help software development teams deliver better-quality code in a shorter period of time," he said.
To simplify the software development process and help with the legacy code, Burgun said customers could validate their software running on the RTL design emulated in EVE's ZeBu. It behaves as a fast, cycle-accurate model of the hardware design.
For instance, he continued, some EVE customers can run their firmware and software six months prior to tapeout. They can check the porting of the legacy code on the new hardware very early and trace integration bugs all the way to the source, whether in software or in hardware. When the engineering samples come back from the fab, 99 percent of the software is already validated and up and running.
Thus, "ZeBu minimizes the number of respins for the chip and drastically reduces the bring-up time for the software," Burgun said.
- Anne-Francoise Pele
EE Times
Monday, July 14, 2008
Nokia urges consumers to recycle old phones
Only 3 percent of people recycle their mobile phones, despite the fact that most have old devices lying around at home that they no longer want, according to a global consumer survey released by Nokia. Three out of every four respondents said they don't even consider recycling their devices, and nearly half were unaware that it is possible to do so.
The survey is based on interviews with 6,500 people in 13 countries: Finland, Germany, Italy, Russia, Sweden, the United Kingdom, the United Arab Emirates, the United States, Nigeria, India, China, Indonesia and Brazil. It was conducted to help Nokia find out more about consumers' attitudes and behaviors toward recycling, and to inform the company's take-back programs and efforts to increase recycling rates of unused mobile devices.
"It is clear from this survey that when mobile devices finally reach the end of their lives that very few of them are recycled," said Markus Terho, director of environmental affairs, markets, at Nokia. "Many people are simply unaware that these old and unused mobiles lying around in drawers can be recycled or how to do this. Nokia is working hard to make it easier, providing more information and expanding our global take-back programs." He added, "If each of the 3 billion people globally owning mobiles brought back just one unused device we could save 240,000 tons of raw materials and reduce greenhouse gases to the same effect as taking 4 million cars off the road. By working together, small individual actions could add up to make a big difference."
'Unaware' of recycling
Despite the fact that people have each owned around five phones on average, very few devices are being recycled once they are no longer used. Only 3 percent of respondents said they had recycled their old phone. Yet very few old devices, 4 percent, are being thrown into landfill. Instead, the majority, 44 percent, are simply being kept at home, never used. Others are giving their mobiles another life in different ways: a quarter are passing on their old phones to friends or family, and 16 percent are selling their used devices, particularly in emerging markets.
Globally, 74 percent of consumers said they don't think about recycling their phones, despite the fact that around the same number, 72 percent think recycling makes a difference to the environment. This was consistent across many different countries with 88 percent of people in Indonesia not considering recycling unwanted devices, 84 percent in India, and 78 percent of people in Brazil, Sweden, Germany and Finland.
One of the main reasons so few people recycle their mobile phones is that they simply don't know it is possible, the survey revealed. Up to 80 percent of any Nokia device is recyclable, and precious materials within it can be reused to help make new products such as kitchen kettles, park benches, dental fillings or even saxophones and other metal musical instruments. Globally, half of those surveyed didn't know phones could be recycled like this, with awareness lowest in India at 17 percent and Indonesia at 29 percent, and highest in the U.K. at 80 percent, followed by Finland and Sweden at 66 percent.
Green efforts
"Using the best recycling technology nothing is wasted," noted Terho. "Between 65-80 percent of a Nokia device can be recycled. Plastics that can't be recycled are burnt to provide energy for the recycling process, and other materials are ground up into chips and used as construction materials or for building roads. In this way nothing has to go to landfill."
Many people interviewed for the survey, even if they were aware that a device could be recycled, did not know how to go about doing this. Two thirds said they did not know how to recycle an unwanted device and 71 percent were unaware of where to do this.
Nokia has collection points for unwanted mobile devices in 85 countries around the world. People can drop off their old devices at Nokia stores and almost 5,000 Nokia Care Centers.
Responding to the survey findings, Nokia is developing a series of campaigns and activities to give people more information on why, how and where to recycle their old and unwanted devices, chargers and mobile accessories. The company is also expanding its global take-back program by adding more collection bins and promoting these in store to raise greater awareness.
More firms endorse Symbian Foundation
The initial board members of the Symbian Foundation have welcomed the continuing support from mobile industry leaders for their plans for the initiative and the evolution of Symbian OS as an open platform for mobile innovation.
Plans for the Symbian Foundation were announced on June 24 with initial board members AT&T, LG Electronics, Motorola, Nokia, NTT Docomo, Samsung Electronics, Sony Ericsson, STMicroelectronics, Texas Instruments and Vodafone, together with Symbian Ltd. An additional 11 organizations supported the announcement that day, and nine more confirmed their endorsement of plans for the Symbian Foundation, including mobile operators 3, América Móvil and TIM; semiconductor manufacturer Marvell; and services and software providers Aplix, EB, EMCC Software, Sasken and TietoEnator.
"We were delighted with the broad support for plans for the Symbian Foundation", said Kai Öistämö, executive VP, devices at Nokia, on behalf of the initial board members. "We believe that this is a significant move for our industry and are pleased that these additional market leaders agree and are giving their support to the initiative."
Sematech: 450mm program is on track
International Sematech is moving full speed ahead with its 450mm programs, but the question is whether the industry can meet its lofty goals in building 450mm fabs by 2012.
On July 9, chipmaking consortium Sematech provided an update on its next-generation 300- and 450mm programs, saying that they are on track and making steady progress.
The consortium is up and running with its "factory integration test bed" facility for the development of 450mm fab tools. Sematech is also testing silicon wafers based on 450mm technology. And the group claims it has made progress on its so-called "Next Generation Factory" (NGF) program, geared to bring lower costs and reduced cycle times in 300mm wafer manufacturing.
Recently, Sematech unveiled two next-generation fab programs: 300mmPrime and the International Sematech Manufacturing Initiative's ISMI 450mm effort.
Need for 450mm?
There is widespread support among the fab-tool community for 300mmPrime, which looks to boost the efficiency of existing 300mm fabs, thereby pushing out the need for 450mm plants.
The newer, more controversial ISMI 450mm program, announced last year at Semicon West, calls for some chipmakers to make a more direct transition from 300mm to the larger 450mm wafer size.
Many fab-tool vendors are reluctant to endorse the next-generation wafer size or devise 450mm tools, saying that it is simply too expensive. Many vendors claim that 300mm fabs are suitable for most applications and the real goal for the industry is to improve the productivity of current plants.
"There is still a lot of concern and debate" about 450mm fabs among the equipment makers, said Scott Kramer, VP of manufacturing at International Sematech, but "the tide has shifted over the last 12 months."
A few fab-tool and materials vendors have developed 450mm technologies, but many suppliers have publicly slammed Sematech's 450mm program, saying the economics simply don't add up.
However, the mood is beginning to change, especially after Intel Corp., Samsung Electronics and Taiwan Semiconductor Manufacturing Co. Ltd in May reached an agreement on the need for industry collaboration on 450mm wafers starting in 2012. Intel, Samsung and TSMC indicated that the semiconductor industry can improve its return on investment and reduce 450mm research and development costs by applying aligned standards, rationalizing changes from 300mm infrastructure and automation, and working toward a common timeline.
Intel, Samsung and TSMC represent a major chunk of the world's capital equipment buyers. Because those companies are pushing for 450mm fabs, their commitment could jumpstart the development of the next-generation wafer size.
Many believe that 450mm tools will not be ready in the 2012 timeframe. Even Kramer acknowledged that the 2012 timetable for 450mm fabs is "very aggressive."
"Those are risky goals," he said.
300mm vs. 450mm
To jumpstart the 450mm era, Sematech last year announced a plan to devise a "factory integration test bed" facility for the development of 450mm fab tools. The proposed facility would help enable chip-equipment makers to develop the initial fab-automation gear, such as carriers, load ports, modules and other items.
Providing an update on the "test bed," Tom Abell, 450mm program manager at Sematech, said the facility is operational. At present, Sematech has put the "test bed" at the Advanced Technology Development Facility, the consortium's former R&D foundry. Based in Austin, Texas, that facility was recently sold to SVTC Technologies Inc.
The facility is using the first 450mm handlers from Brooks Automation Inc. and carriers from Entegris Inc. The pitch specification for these tools is 10mm. With the fab-automation gear, Sematech has demonstrated a 450mm wafer running at 100,000 cycles, Abell said.
Sematech is also in the process of developing a standard for 450mm silicon wafers. At present, there are five wafer-thickness standards vying for dominance in the arena, each with its own set of "tradeoffs," Kramer said.
Initially, Sematech is exploring 450mm wafers with an overall thickness of 925 microns. Last year, Japan's Nippon Mining & Metals Co. Ltd claimed to have developed the first 450mm polycrystalline silicon wafers. Sematech is testing wafers from Nippon Mining, but the consortium is also talking to other silicon wafer suppliers, Kramer said.
NGF program
The consortium also claims it has made progress on its 300mm NGF Program, which focuses on global infrastructure for 300mm hardware and software. It includes 300mmPrime and is supported by ISMI's four other programs in continuous improvement, 450mm manufacturing, metrology, and environment, safety and health.
"The 300mm NGF Program offers a wider look at 300mm productivity with a broader set of initiatives—and it works for companies whose business plans don't necessarily include a larger wafer size," said Kramer in a statement last year. "Our priority is to extend productivity improvements to existing 300mm fabs in addition to supporting 'green field' facilities."
The overall goal of the program is a 30 percent reduction in cost per wafer and a 50 percent reduction in cycle time. As was the case last year, Sematech said it has not yet reached those targets.
In new simulation data, however, the consortium claims it is coming closer. It has simulated a 30 to 40 percent improvement in cycle times and a 10 to 15 percent improvement in cost. In other data, it has demonstrated a 60 percent improvement in cycle times and a 10 percent improvement in cost.
The bottleneck remains in moving the wafer lots from one tool to another. The goal is to process wafers without any delays, according to Sematech.
- Mark LaPedus
EE Times
It's final: Nokia concludes Navteq purchase
Nokia has completed its purchase of Navteq, a digital map information provider.
As part of Nokia, Navteq will continue to develop its expertise in the navigation industry, service its customers, and invest in the further development of its industry-leading map data and technology platform. It will continue to build out and expand coverage of countries already included in its database as well as add new pieces of both static and dynamic content.
Powered by Navteq's map data, Nokia will redefine the Internet and connected experiences by adding context—time, place, people—to Web services optimized for mobility.
"Nokia and Navteq together make a powerful combination, and customers will benefit as the transaction enables Navteq to accelerate its expansion into new regions and introduce innovative new content. This is an industry poised for further growth and Navteq will play a major role in the field," said Olli-Pekka Kallasvuo, president and CEO, Nokia. "The addition of Navteq comes at the right time for Nokia's business, allowing us to create the leading location platform just as context-aware and location-based Internet services expand rapidly into mobile communications devices."
iPhone mania continues
Aside from some very surprising component choices in key parts of the upgraded communications section, as well as some software improvements and some basic design tweaks, the old adage 'if it ain't broke, don't fix it' has clearly shaped the design of Apple Inc.'s iPhone 3G.
"It's incrementalism at play," said David Carey, president of teardown specialist Portelligent. "They learned a bit from their 'Touch' solution and replaced two boards with one."
Instead of trying to reinvent the device, Apple focused on enhancing the user experience and expanding its fan base. It has done this by not only expanding its geographical footprint and speeding up its wireless connection, but also through the iPhone applications development network. Clearly one of the bigger stories behind the 3G launch, the developer program will see Apple providing resources, real-time testing and distribution, to accelerate the delivery of more diverse applications to the consumer.
The success of that program will be determined over the coming months, but for those consumers with an iPhone 3G in hand, there will be little to 'oooh' and 'aaah' about, aside from the 3G data rates, where available.
Surprise design wins
From the outside, the phone looks very much the same, except for a plastic backing and a move away from a recessed headphone jack to a flush connector. It has the same look and feel and the same 2Mpixel camera feature. That said, it does add built-in GPS capability and MobileME application software.
Analysts from Portelligent, as well as TechOnline and Semiconductor Insights, were taken aback by the strength of Infineon's wins in the 3G communications portion, as well as the inclusion of TriQuint for three front-end modules.
"Infineon clearly made their mark on this board with four key design wins," said Allan Yogasingam, a TechOnline technology analyst. "And TriQuint really came from left field with their win their modules. I didn't see a single press-release or speculative article hinting at a relationship between the two companies. In today's internet world, that's a tough thing to keep under wraps."
TriQuint provides three power-amplifier (PA) front-end modules. The first is the TQM676021, an integrated 3V linear UMTS Band 1 PA, duplexer and transmit filter module with an output power detector; it supports high-speed uplink packet access (HSUPA) operation with transmission data rates up to 10Mbit/s. Next is the TQM666022, a similar device for Band 2 operation. Finally comes the TQM616035 W-CDMA/HSUPA PA-duplexer module for Bands 5 and 6.
Moving up the signal chain, Infineon won big. It supplies the UMTS transceiver, suspected to be the PMB 6952, as well as the baseband processor, which is actually a two-chip module in a single package. The first chip is the X-Gold 208 (PMB 8877), which caters to GSM/GPRS/Edge waveforms. The second chip is marked PMB 8802 and is suspected to be the W-CDMA/HSDPA accelerator for 3G. There is still some debate as to whether this combo package with Apple markings may in fact be Infineon's X-Gold 608 (PMB 8878), which TechOnline product manager Greg Quirk and analyst Allan Yogasingam expected to see, but that chip has not yet been available to verify under Semiconductor Insights' microscope.
"In any case, that it's broken into two chips is surprising," said Carey, given that both Nokia and Qualcomm have integrated both functions it into a monolithic die. However, there may be more to the decision than design choice. "We suspect the second die has something to do with one of the InterDigital patents," said Yogasingam, referring to an Apple, InterDigital patent dispute last year.
The baseband's support memory comes courtesy of Numonyx, the Intel/STMicroelectronics spin-off. It includes 16Mbyte NOR flash and 8Mbyte pseudo-SRAM (PF38F3050M0Y0CE).
Rounding out the communications function is the Skyworks SKY77340 824- to 915MHz quad-band GSM/Edge amplifier module, the same part used in the original iPhone.
Power management in the iPhone 3G is split between two ICs: the communications portion of the device is handled by Infineon's SMARTi Power 3i, while system-level power control and management is handled by NXP (exact device to be determined, though Carey believes it's the PCF50633, as in the original iPhone).
The Linear Technology LTC4088-2 takes care of battery charging and general USB power control.
Built-in GPS
Aside from 3G capability, one of the big differentiators of the new iPhone device is its built-in GPS capability, which is provided by yet another Infineon chip, this time the PMB 2525 Hammerhead II. "In the old one [original iPhone], GPS was software enabled and was accurate to within blocks," said Quirk. "This time it's accurate to within meters."
The Hammerhead II integrates an assisted-GPS (A-GPS) baseband processor with a low-noise GPS RF front end and multipath mitigation to avoid large errors in urban environments. While the die markings indicate it's actually a PMB 2520 Hammerhead I chip, Quirk pointed out that it's common practice to take the same die, make some fairly simple connection or routing changes to alter or improve functionality, and then re-label it as a 'new' chip.
Memory support
For the main applications processor, Apple chose to stick with a tried-and-true Samsung ARM11-based design, with some tweaks, supported by 128Mbytes of stacked, package-on-package DDR SDRAM, also from Samsung. Externally, the main memory comes in two versions for the iPhone: 8Gbyte and 16Gbyte NAND flash. In this case, it is 8Gbyte, but the source was surprising: Toshiba, in the form of a single-chip device segmented into four 2Gbyte die (TH58NVG6D1D).
According to Quirk, the choice of Toshiba was unusual, given that Apple had a "huge" deal to buy all Samsung memory. The two companies were also reportedly discussing plans for volume purchases of NAND flash chips to be used in all iPods and iPhones from June to December 2007 (Source: EE Times-Asia).
"To see Toshiba makes me wonder if that deal is no longer in place," he said. Granted, those deals are aging, he acknowledged, "but now that the new iPhones have come out and seem to be using Toshiba, does this mean that Samsung is playing second string to Toshiba? It could mean some good stock boost for Toshiba!"
With regard to the current 16Gbyte maximum offered with the iPhone 3G, Quirk suspects that may not be enough, given that half a gigabyte can disappear for just one compressed movie. Add photos and MP3 files and Quirk sees that 16 Gbytes getting eaten up pretty fast.
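As a rough, back-of-the-envelope illustration of that concern (the 0.5 Gbyte-per-movie figure is Quirk's; the song and photo sizes are assumptions for the sketch):

```python
# Rough storage-budget arithmetic behind Quirk's concern. The ~0.5 Gbyte
# per compressed movie comes from the article; song and photo sizes are
# illustrative assumptions.
capacity_gb = 16
movies_gb = 10 * 0.5              # ten compressed movies at ~0.5 Gbyte each
songs_gb = 1500 * 4 / 1024        # ~1,500 MP3s at ~4 Mbyte each
photos_gb = 2000 * 2 / 1024       # ~2,000 photos at ~2 Mbyte each

used_gb = movies_gb + songs_gb + photos_gb
print(f"{used_gb:.1f} of {capacity_gb} Gbytes used")   # -> ~14.8 Gbytes
```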
The SST25VF080B 8Mbit serial flash chip from SST rounds out the iPhone 3G's memory support.
The tried-and-true philosophy symbolic of the new iPhone extends to the accelerometer, the LIS331DL from ST, as well as the single-chip 88W8686 Wi-Fi offering from Marvell. The Marvell chip is accompanied on the back of the main board by a CSR BlueCore6-ROM Bluetooth chip, which surprised the analysts, all of whom were expecting to see the same BlueCore4 device used in the original iPhone.
Rounding out the main chips on the iPhone are the Wolfson WM6180C audio codec, which replaces the WM8758 used on the original iPhone, as well as the Broadcom BCM5974 touchscreen controller, National Semiconductor LM2512AA Mobile Pixel Link display interface and the Texas Instruments CD3239 touchscreen line driver.
The new iPhone's touchscreen approach is the same as that of the iPod Touch, said Carey. The Gen1 iPhone had three chips in its touchscreen solution: a Broadcom controller, an NXP 32bit microprocessor and a TI line driver. The Touch reduced this to just a revised Broadcom chip (which absorbed the microprocessor function) and the TI line driver. "The 3G uses the same Broadcom chip as the Touch, and an updated TI line driver (smaller chip)."
While Apple's rollout of the iPhone 3G may not have been met with the same frenzied reception as the original, its fan base remains strong, according to Yogasingam. "After spending the better part of the night with people waiting in line for an iPhone, I'm still amazed at how many people have embraced the Apple brand and are willing to do anything to be an early adopter of anything hip and new from Apple. Apple has this air with its fan base that it could do no wrong."
- Patrick Mannion
TechOnline
Wednesday, June 25, 2008
Mobile WiMAX faces future setbacks
Mobile WiMAX may become a spent technology even before it gains any commercial traction, a market research group warned.
According to Frost & Sullivan, unless spectrum auctions and commercial Mobile WiMAX rollouts compliant with Wave 2 Phase 2 certification gather momentum before the end of 2008, the market scope for the broadband wireless technology "will be insignificant."
The researchers added that the technology faces a number of challenges that are likely to make it unfeasible as a mobile "access" technology.
However, they countered the bleak analysis by noting that the huge investment in mobile WiMAX may not have been for naught. The group believes the work carried out on mobile WiMAX has the potential to spur new ventures, which could ultimately lead Mobile WiMAX to merge with 3G LTE.
"Recent events have been unfavorable toward Mobile WiMAX," said Luke Thomas, program manager, Frost & Sullivan. He added that, "Sprint-Nextel Corp. recently announced a delay to the commercial rollout of its Mobile WiMAX service, Xohm, and has now stated that the first commercial service of Xohm will be in Baltimore in September 2008 and Washington DC and Chicago by Q4 2008, provided that the new WiMAX venture 'ClearWire' deal closes by Q4 2008."
Recent trends
Thomas said any operator looking at Mobile WiMAX has to consider the current environment where 97 percent of laptops are shipped with Wi-Fi technology.
3G LTE is expected to be a fully ratified standard by Q4 2008 or 1H 2009, with deployments slated for 2H 2009 or Q1 2010 offering peak data rates of up to 170Mbit/s.
He noted that the number of dual-mode Wi-Fi/cellular phones is on the rise, with newer models emerging at lower cost and with better battery life. He stressed that Alcatel-Lucent, Ericsson, NEC Electronics, NextWave Wireless, Nokia, Nokia Siemens Networks and Sony Ericsson recently encouraged all interested parties to join an initiative to keep royalty levels for essential LTE patents in mobile devices below 10 percent of the retail price.
What seems unresolved
"It is still unclear if members of the WiMAX Forum have reached an agreement pertaining to the IP rights they possess for Mobile WiMAX. Hence, prominent members of the Forum formed the Open Patent Alliance to address this issue," said Thomas.
He added that 2009 would be the year when operators begin to realize that Mobile WiMAX can no longer be considered a feasible mobile broadband access technology. "In terms of indoor wireless broadband, Wi-Fi fits well in this space, and with the emergence of 802.11n, which includes MIMO, throughputs would be far better than what Mobile WiMAX can deliver," he noted.
With respect to outdoor mobile broadband environments, he said, users would expect Mobile WiMAX to seamlessly hand off to cellular networks in the absence of WiMAX reception. "In reality, this is not possible, as the technology is not backward compatible with existing cellular technologies," he stressed.
At a recent WiMAX Forum workshop in Dubai, participants accepted that Mobile WiMAX is not optimized to simultaneously handle both data and voice applications as efficiently as high-speed packet access (HSPA) or 3G LTE. It is therefore unclear whether the initial client devices for Mobile WiMAX (ultramobile PCs or tablet devices) will meet with any degree of consumer receptiveness.
"While the Nokia N810 tablet will retail at $440 for Xohm users later this year, it is still ambiguous if consumers will want one mobile device for voice, based on cellular technology and another for personal broadband based on Mobile WiMAX," said Thomas, adding that, "This is relevant, considering that HSPA coupled with Wi-Fi can do both in a single mobile device."
- John Walko
EE Times Europe
New iPhone carries $173 tag, reveals virtual teardown
Apple Inc.'s second-generation iPhone announced this month is expected to carry an initial hardware BOM and manufacturing cost of $173, according to a "virtual teardown" conducted by iSuppli Corp. The virtual teardown offers preliminary analysis and estimates of the iPhone content, suppliers and cost structure.
"At a hardware BOM and manufacturing cost of $173, the new iPhone is significantly less expensive to produce than the first-generation product, despite major improvements in the product's functionality and unique usability, due to the addition of 3G communications," said Jagdish Rebello, director and principal analyst for iSuppli. "The original 8Gbyte iPhone carried a cost of $226 after component price reductions, giving the new product a 23 percent hardware cost reduction due to component price declines."
iSuppli's preliminary virtual teardown estimates the 8Gbyte 3G iPhone's costs. The estimate excludes other costs, such as software development, shipping and distribution, packaging, and the miscellaneous accessories included with each phone.
New business model
With the second-generation iPhone, Apple is making a significant departure in its pricing strategy.
"The original 2G phone was sold at an unsubsidized price of $499," Rebello noted. "However, at a retail price of $199 for the low-end 8Gbyte version of the new 3G model, wireless communications service carriers will be selling the product at a subsidized rate, using a common business model for the mobile handset market."
Based on iSuppli estimates, the subsidy paid by wireless carriers to Apple will be about $300 per iPhone. "This means that with subsidies from carriers, Apple will be selling the 8Gbyte version of the second-generation iPhone to carriers at an effective price of about $499 per unit, the same as the original product," Rebello explained.
For the first version of the iPhone, Apple received a portion of the wireless carriers' revenue from service subscriptions. For the second-generation version, Apple is not getting any service revenue, making it more imperative that the company turn a profit on the actual hardware through carrier subsidies.
"Hardware is vital to Apple profits, valuation and revenue in the consumer electronics and wireless communications realms," Rebello said. "In fact, two-thirds of Apple's revenue from the iPod still is derived from hardware, while only one third is from the iTunes service and accessories. The second-generation iPhone is no exception."
iSuppli observed that Apple's iPod and iPhone products typically are priced about 50 percent more than their BOM and manufacturing costs. With the new iPhone sold at $199 and the estimated subsidy of $300, Apple will achieve an even higher BOM/manufacturing margin, noted iSuppli.
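For readers who want the arithmetic, a quick sketch of iSuppli's numbers as reported above (all figures are the firm's estimates, not Apple disclosures):

```python
# Back-of-the-envelope margin math from the iSuppli estimates cited above.
bom_cost = 173        # estimated BOM + manufacturing cost, 8Gbyte 3G iPhone ($)
retail = 199          # subsidized retail price ($)
subsidy = 300         # estimated carrier subsidy per unit ($)

effective_price = retail + subsidy             # what Apple effectively receives
margin = (effective_price - bom_cost) / effective_price
print(f"effective price: ${effective_price}")              # -> $499
print(f"hardware margin: {margin:.0%} of effective price")  # -> ~65%
```

At roughly 65 percent of the effective price, that margin is indeed well above the margin implied by the roughly 50 percent markup over BOM that iSuppli describes for earlier iPods and iPhones.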
Future costs
Like all electronic products, the 3G iPhone's BOM costs will decrease over time as component prices decline. The BOM/manufacturing cost of the second-generation iPhone will fall to $148 in 2009, down about 14 percent from $173 in 2008, according to data from iSuppli's Mobile Handset Cost Model.
"If the 3G iPhone design is unchanged, the cost will decline to $126 in 2012," said Tina Teng, wireless communications analyst at iSuppli.
Nokia buys 52% of Symbian shares for $410M
Nokia Corp. has made a cash offer to acquire all of the shares of Symbian Ltd that Nokia does not already own, at a price of about $5.67 per share, or approximately $410 million for the remaining 52 percent of Symbian's shares.
The Finnish handset maker has received irrevocable undertakings from Sony Ericsson Mobile Communications AB, Telefonaktiebolaget LM Ericsson, Panasonic Mobile Communications Co. Ltd and Siemens International Holding BV to accept the offer, representing approximately 91 percent of the Symbian shares subject to the offer. Nokia also expects Samsung Electronics Co. Ltd. to accept the offer.
Symbian is the software company that develops and licenses Symbian OS, the open OS for mobile devices. User interfaces designed for Symbian OS include S60 from Nokia; MOAP(S), used on NTT DoCoMo's 3G network; and UIQ, designed by UIQ Technology, a joint venture between Motorola and Sony Ericsson.
The acquisition is a fundamental step in the establishment of the Symbian Foundation, announced by Nokia together with AT&T, LG Electronics, Motorola, NTT DoCoMo, Samsung, Sony Ericsson, STMicroelectronics, Texas Instruments and Vodafone.
"This is a significant milestone in our software strategy," said Olli-Pekka Kallasvuo, CEO of Nokia. "Symbian is already the leading open platform for mobile devices. Through this acquisition and the establishment of the Symbian Foundation, it will undisputedly be the most attractive platform for mobile innovation. This will drive the development of new and compelling, web-enabled applications to delight a new generation of consumers."
"Ten years ago, Symbian was established by far sighted players to offer an advanced open OS and software skills to the whole mobile industry", said Nigel Clifford, CEO of Symbian. "Our vision is to become the most widely used software platform on the planet and indeed today Symbian OS leads its market by any measure. Today's announcement is a bold new step to achieve that vision by embracing a complete and proven platform, offered in an open way, designed to stimulate innovation, which is at the heart of everything we do."
Nokia expects the acquisition to be completed during Q4 2008, subject to regulatory approval and customary closing conditions. After the closing, all Symbian employees will become Nokia employees.
Coming soon: Open mobile platform
Nokia Corp.'s announcement that it will pay about $410 million for the 52 percent of Symbian Ltd it does not already own has sealed the deal to create the foundation that will develop an open, free-for-use platform for mobile phones.
Epoch making. Unprecedented. Setting mobile software free. These were just some of the typical superlatives used by the senior executives from Symbian, Nokia, Sony-Ericsson, Motorola, Texas Instruments, STMicroelectronics, Vodafone and others at the London conference where Nokia disclosed its planned buyout.
It is certainly a bold and innovative move by Nokia, and one that will be hugely significant for the mobile software sector. "Typically selfless and self-interested," commented David Levin, a former CEO of Symbian and now CEO of United Business Media, the company that owns TechInsights, publisher of EE Times.
What a pity, then, that the assembled executives from the nine companies involved in the shrewd, logical, grand and ambitious scheme could not, under questioning, bring themselves to admit that it is also a defensive response to efforts by Microsoft, Google (with its delayed Android project) and, to a lesser extent, Apple with the iPhone, along with a slew of other open source platforms such as the growing LiMo Foundation, to muscle in on the mobile phone OS sector.
Uniting Symbian OS and the S60 platform, Motorola's UIQ and DoCoMo's MOAP is likely to take some time—two years to full completion was the target mentioned—and to tax the "hugely experienced and talented" engineers whom Nokia executive VP Kai Oistamo said he looks forward to welcoming onto Nokia's payroll.
Elements of the platform being created will be available to the developer community as soon as the Symbian Foundation is up and running, expected to be later this year or in early 2009, when Nokia has completed the purchase of the shares owned by Ericsson, Sony Ericsson, Matsushita, Siemens and Samsung in Symbian Ltd. So over the next two years, we will see the integration of the three user interface systems into one, with a release of code sometime in 2011.
The Foundation—backed by five of the top mobile phone OEMs, three major operators and two chipmakers—would be "open to all comers," commented Oistamo. It does not make non-fragmentation a condition of membership and licensing, but it is more than likely that there will be a compliance brand and a cold-shoulder approach to any that attempt to fragment the system.
More direct input from chipmakers and network operators, as well as closer integration of the OS and user interface, should make the whole platform more stable and attractive to operators and the millions of application developers. Operators such as Vodafone have long bemoaned the fact that they have to accommodate too many platforms.
One analyst, Richard Windsor of Nomura Securities, called the move "the ultimate manifestation of the 'boy scout effect,' where Nokia believes that a greater opportunity for all will result in more profits for Nokia itself than going it alone." We could not agree more with his assessment.
Only time will tell whether this is a masterstroke that leads to total dominance or the straw that breaks the camel's back.
There are two big risks here. So far, Symbian has been a tightly controlled ecosystem where fragmentation has not been allowed to happen. This control point seems in danger of being breached. This, combined with standardization by consensus, could give proprietary systems such as Windows Mobile and Apple's an advantage in time-to-market and nimbleness.
- John Walko
EE Times Europe
Monday, June 23, 2008
mimoOn taps Nokia wireless experts
Software-defined radio (SDR) solutions provider mimoOn GmbH has added key wireless and software engineers from Nokia's research locations in Bochum, Germany, and Helsinki, Finland. The team includes members with backgrounds in software-defined radio, baseband design, software architectures, and UMTS/Long Term Evolution (LTE) PHY and protocol layer implementation for wireless handsets and base stations.
"The mobile wireless world is transitioning to fourth-generation standards in order to support high bandwidth applications such as mobile video, multimedia, Internet browsing, and interactive gaming. mimoOn is well positioned with its 4G LTE solutions which are finding market traction throughout the ecosystem, from macrocell, picocell, femtocell and test equipment manufacturers to semiconductor providers. These Nokia engineers will help propel the company securely into its next growth phase," commented Thomas Kaiser, CEO of mimoOn.
mimoOn's solutions are optimized for low memory footprint, as well as for processing power and power consumption. The PHY, MAC, RLC, PDCP, and RRC components are all fully featured, based on the Release 8 baseline specifications from 3GPP. In addition, mimoOn's solution is entirely software defined, so it is easy to modify or customize even after being deployed—an important feature as the standards for LTE will continue to be revised well into 2009.
mimoOn will demonstrate its LTE SDR solutions at the NGMN Industry Conference in Frankfurt from June 25-27.
Wireless sensor nets hunt for full 'killer' apps
Wireless sensor networks (WSNs) are not being fully utilized in commercial markets due to the lack of a 'killer application' to drive interest, a Plextek-led report for U.K. communications industry regulator Ofcom said.
The 10-month study was commissioned to examine technology developments in WSNs, along with market growth scenarios and their spectrum implications.
Plextek worked with the University of St. Andrews and TWI Ltd, a Cambridge-based independent research and technology organization, on the research, which found that it is the traditional sensing applications that are currently commercially exploiting the advantages of WSNs.
The popularity of WSNs
The report suggested that WSNs may become more widely deployed over the next three to five years, with systems continuing to use existing license-free bands, including 13.56MHz, 433MHz, 868MHz and 2.4GHz. The main issue for WSNs will be band crowding, especially at 2.4GHz, given increasing Wi-Fi use.
Ofcom commissioned the study as part of its "Tomorrow's Wireless World" R&D program into the future of communications technology.
"Our research produced some very interesting conclusions," said Steve Methley, senior consultant, Plextek. "The lack of a killer application may be due to limiting factors such as current cost of wireless nodes and lack of understanding or experience by end users, especially regarding 'real-world' reliability," he added.
Methley noted that there is also a need for further improvements in batteries and energy scavenging technologies.
Suggested strategies
"One possible movement toward a killer application is to let major systems integrators get involved," said Methley. "Such players will increasingly come on board when there is a need to take a professional approach to defining, installing and maintaining substantial wireless sensor networks," he added.
The study suggested that while existing unlicensed spectrum can adequately support WSNs, "a dramatic increase in use could prove problematic."
"Typical radio protocols such as the popular 802.15.4 standard are designed to be polite and to check for clear channels before transmitting," it noted. This may become a problem when bands become crowded. The standard suffers because of its politeness in the face of increasing Wi-Fi usage, particularly in the streaming applications. This may lead to the appearance that WSNs are unreliable, an important issue as the perception of unreliability is one of the key barriers identified for WSN adoption.
- John Walko
EE Times Europe
Freescale channels embedded
It takes a strong ego to climb aboard a large, well-established, well-liked company that's in distress and think you can turn it around. Ninety days into his tenure, Rich Beyer, the no-nonsense, straight-shooting chairman and CEO of Freescale Semiconductor Inc., still thinks he has what it takes. Beyer used the recent Freescale Technology Forum to lay out just how he plans to do it.
While it was no surprise to hear the new boss will continue to emphasize automotive, a sector in which Freescale has always done exceptionally well, he also took great pains to hammer home the centrality of embedded.
"Freescale, at its heart, is an embedded-processor company," Beyer said. "We serve those applications with additional functionality, such as analog, sensors, RF and software. That strength in processors is the essence of where our company is leading to," he added.
Underscoring that message, Freescale used the four-day conference to introduce QorIQ, a multicore processor designed to enable advanced network processing, or what Freescale marketers call "the Net Effect."
The prospect for innovation
"The processor's flagship device will have eight cores, although it will take some time to develop the tools to support that level of functionality," said Lisa Su, chief technology officer, Freescale.
Beyer and his team of executives and technologists spoke as one at the forum on how the company's strengths and customer relationships will form the paths to a turnaround. Along with embedded, automotive will continue to be an essential element, especially as hybrid vehicles raise the electronics bill of materials. Also cited as crucial to Freescale's future are wireless, analog and sensors, combined with a new emphasis on the fast-turnaround consumer market in general and the "green," health and network-processing areas in particular.
Learning from experience
While the new QorIQ processor signifies the top end of the embedded spectrum, the company is no less passionate about the lower-end, microcontroller segment. "We have a God-given right to be a leader in microcontrollers," said Henri Richard, senior VP and chief sales and marketing officer, referring to Freescale's long history in the space. Richard noted, however, that through missteps the company eventually ceded ground in the 8bit arena and made some poor distribution-channel decisions.
In an interview during the opening day of the forum, Beyer had no illusions about the task ahead of him. "Clearly, we have had a series of challenges over the past several years and have shared the pain with former parent Motorola," he said.
But the new boss believes that his experience as CEO of Intersil Corp. and Elantec Semiconductor leaves him well equipped for the road ahead. Asked about his knowledge of the market and the product families, he affirmed that he has dealt with these issues before, so that experience will come in handy. He described Freescale as in good, if not great, shape, adding, "This is a company that I want to be successful."
What must be done
Three months on the job have led Beyer to conclude that for Freescale to be successful, it must focus on its strengths and change its business practices and models in some areas right down to the roots.
"We do not have deeply in our gene pool the DNA for consumer products," he said. "We need to invest in markets that will see return in 18 to 24 months, but we also need to invest with more stable returns," he added. The marriage of the company's i.MX processing platform with the recently acquired SigmaTel Inc.'s low-cost analog/mixed-signal expertise is a move in that direction.
While SigmaTel hit it big with its iPod design win, Beyer is not betting the bank on a repeat. "We're not depending on hits, since you need an awful number of strikeouts to get those home runs," he said. "But we're in enough applications that we'll have many singles," he added.
Besides directing his business team to come up with a three-level product-development plan that calls for near-, mid- and long-term revenue returns, Beyer is also looking to improve customer execution, in part by not overcommitting on projects. "We try to do way too many things: It leads to failure," he said.
The combination of a push deeper into embedded and a strong integration story puts the company right up against Texas Instruments Inc. (TI). "TI has recently started to talk about itself in the context of embedded processors, but they're not really an embedded-processor company," said Beyer. "That's never been a central market for TI," he added.
What others think
TI begs to differ. When asked about Beyer's comment, Mark Dennison, VP of strategic marketing at TI, argued that the company has been shipping processors into embedded applications such as base stations, VoIP equipment and software-defined radio. TI also has a strong microcontroller line, exemplified by the MSP430.
"We've shipped a few billion ARM cores, and in February we launched the 3500 series," Dennison said. "I'm a little confused as to where Freescale is coming from," he added.
Jeff Bier, president of research and analysis firm BDTI, agreed. "To say TI is not embedded is ridiculous," he said. To Will Strauss, president of Forward Concepts, it's all semantics, given the wide range of "embedded processor" definitions. "Put a Pentium in anything besides a PC, and it's embedded," he stressed.
Dennison also commented on Freescale's claimed integration and analog advantage, pointing to TI's power-management, amplifier, RF and converter lineup. "We can integrate all those technologies," he said, "whether it is on stacked dice, multichip modules or package-on-package."
- Patrick Mannion
EE Times
ABI: Wireless HDTV to hit 1M installations in 2012
After wireless phones, wireless Internet and wireless home networks, the next attraction coming to the living room is wireless HDTV. At present, the market is still in its incubation stage, with fewer than 100,000 devices expected to ship this year.
According to a study from ABI Research, optimistic forecasts point to 2012 as the earliest year for having 1 million wireless HDTV installations worldwide.
Meanwhile, a "battle of technologies" is being fought. There are three contending systems, loosely characterized as 5GHz, 60GHz and UWB.
"5GHz technology is better understood and more proven," says principal analyst Steve Wilson, "but achieving the required data rates requires new approaches and more complex solutions. UWB technology has bandwidth advantages at in-room distances but drops rapidly at greater ranges. 60GHz allows high data rates, but so far only one company is even close to a viable solution."
Small numbers of 5GHz and UWB devices are currently shipping; demo products of 60GHz systems are expected early next year.
"Over the next two to three years, we're going to see one or two of these wireless HDTV approaches emerge as the primary ones," says Wilson.
Who would want wireless HDTV and why? Wireless will simplify some installations and allow more flexibility in positioning TVs. There are both commercial applications—digital signage, for example—and domestic applications such as wall-mounting a flat-screen HDTV. "The initial sweet spot in the market is where wired installation would be difficult or complicated," says Wilson.
All the wireless HDTV silicon vendors are venture-backed startups, and most established wireless vendors are waiting to see how the market evolves. Product manufacturers are moving forward with different strategies: some, like Westinghouse and Belkin, are initially targeting commercial and custom installers, where there is clear value-add. In contrast, TV manufacturers such as Sharp and Hitachi are targeting buyers of their latest technology, offering design-oriented products with a wireless connectivity option.
Monday, June 16, 2008
LTE subscribers to boom to 32M in 2013
There will be over 32 million subscribers using Long Term Evolution (LTE) networks by 2013, just three years after it is expected to go commercial, forecasts ABI Research. Three of the largest mobile operators—China Mobile, Vodafone and Verizon Wireless—have announced plans to adopt LTE.
Asia Pacific will account for the largest regional share. "ABI Research anticipates about 12 million Asia-Pacific LTE network subscribers in 2013," says senior analyst Nadine Manjaro. "The remainder will be split about 60-40 percent between Western Europe and North America."
Moreover, LTE commitments from NTT DoCoMo and KDDI in Japan are said to further boost adoption.
The long wait for the Chinese government to issue 3G licenses may also become a factor driving LTE in that country.
"It wouldn't surprise me to see some operators skip over 3G and go straight to LTE," says Manjaro. "Although China's own TD-SCDMA 3G technology will be deployed on a small scale during the [Beijing] Olympics, I can't see operators spending billions to implement that or any other 3G technology if they will just have to upgrade within a year or two."
Since LTE deployment involves new hardware and software, several industry sectors stand to benefit. Before 2010, it will be vendors of the test equipment used to ensure network interoperability and performance. Next will come vendors of the required network infrastructure equipment itself. Finally, it will be device manufacturers.
Because LTE is primarily about data, not voice, its first phase will see devices such as USB dongles for PCs: ABI estimates 53 million to ship by 2013. Because LTE will compete directly with cable and DSL services, in-home modems will also see volume shipments, as will mobile Internet devices and ultramobile PCs. Manjaro calls the device market "a huge opportunity."
Enhance VoIP telephony with HD Voice
By Daniel Hartnett
Infineon Technologies AG
Do you remember hearing FM radio for the first time, or listening to your first CD after years of scratched vinyl? That's the experience high-definition (HD) sound brings to a telephone. As VoIP becomes commoditized, the focus of system developers and service providers shifts from providing VoIP to providing higher-quality VoIP.
Taking advantage of the strong marketing behind HDTV, HD sound is now the accepted brand name for wideband voice. It allows service providers to offer superb, pristine audio quality over their IP phone-enabled home gateways. Traditional "narrowband" telephony was a compromise between speech intelligibility and data rates, providing an acoustic bandwidth of 300Hz to 3.4kHz. In contrast, HD sound uses wideband technologies to offer a transmission range of 50Hz to 7kHz or beyond.
The result is significantly increased intelligibility and a much more natural sound, not only for voice conversation but also for a range of other audio applications, such as MP3 and Internet radio. This article addresses the hurdles associated with delivering HD performance in telephony and explores its market potential.
Wideband telephony
"Wideband" telephony specifies a transmission range of 150Hz to 6.3kHz. While this is not CD bandwidth (20Hz up to 20kHz), the increased bandwidth compared to narrowband offers significantly improved intelligibility.
Wideband telephony was standardized for ISDN with the G.722 codec about 20 years ago but never really enjoyed wide deployment. G.722 did, however, make its way into journalism, where it is often used for voice transmission from remote locations as an alternative to the poor quality of standard telephone lines.
As IP phones already have powerful signal-processing capabilities for narrowband speech-compression algorithms, wideband codecs can easily be handled by the voice engines within IP phones. If the ADCs and DACs support a 16kHz sampling rate, wideband telephony on an IP phone comes with relatively low additional overhead. Another factor driving the development of wideband telephony is the new DECT standard CAT-iq, which also specifies G.722 as the required codec for HD voice.
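To see what the extra bandwidth buys, here is a minimal sketch (assuming NumPy and SciPy are available; the chirp is a stand-in for the spectral spread of speech) that applies the narrowband and wideband masks quoted above and compares how much signal energy each lets through:

```python
# Band-limit a synthetic test signal to the narrowband (300Hz-3.4kHz) and
# wideband (50Hz-7kHz) telephony masks and compare retained energy.
import numpy as np
from scipy import signal

fs = 16000                                    # wideband sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
x = signal.chirp(t, f0=50, t1=1.0, f1=7500)   # sweep across the voice band

def bandlimit(x, lo, hi, fs):
    """4th-order Butterworth band-pass, applied forward-backward."""
    sos = signal.butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return signal.sosfiltfilt(sos, x)

nb = bandlimit(x, 300, 3400, fs)              # POTS/narrowband mask
wb = bandlimit(x, 50, 7000, fs)               # HD/wideband mask

for name, y in (("narrowband", nb), ("wideband", wb)):
    kept = 100 * np.sum(y**2) / np.sum(x**2)
    print(f"{name}: {kept:.1f}% of test-signal energy retained")
```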
PC soundcards support 8-, 16-, 32-, 44.1- and 48kHz sampling rates, and generally have the necessary processing power for wideband codecs. PC-based softphone applications like Skype already have a huge footprint in the market.
Most enterprise IP phones, like Siemens' OpenStage series, already support wideband. The enterprise market is an excellent proof of concept for wideband, as it is much easier to control the hardware and software running on the endpoints. Deploying HD voice in the residential space is much more difficult: wideband requires that both parties in a call have wideband-capable hardware and that the phones immediately shift up to the best codec available.
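How does a phone "shift up" to the best codec? In SIP/SDP-based systems, the caller simply lists its codecs in order of preference. A hypothetical sketch (payload types 9, 0 and 8 are the static RTP assignments for G.722, PCMU and PCMA from RFC 3551):

```python
# Build an SDP audio description that offers G.722 first, so a wideband-capable
# peer negotiates up to HD voice and a narrowband peer falls back to G.711.
# Note the quirk: G722 is advertised with an 8000Hz RTP clock rate even though
# it samples at 16kHz, for historical reasons.
sdp_audio = "\r\n".join([
    "m=audio 5004 RTP/AVP 9 0 8",   # G.722 listed first = preferred
    "a=rtpmap:9 G722/8000",
    "a=rtpmap:0 PCMU/8000",
    "a=rtpmap:8 PCMA/8000",
])
print(sdp_audio)
```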
In the past, VoIP had to contend with a less-than-solid reputation. Since its early days, when only brave pioneers would make a connection over the Internet, broadband users have been quick to take up the offerings of new players in the voice-service market. The traditional trade-off was quality against price.
Today, VoIP quality has improved beyond recognition and is easily comparable to that of POTS services. As ample bandwidth and processing power in customer-premises equipment become the norm, the possibility of using more bandwidth for vastly improved voice quality is real and imminent. This is where providers can differentiate their services.
HD VoIP
VoIP is not just VoIP: HD sound makes it marketable above and beyond price. A POTS phone call sounds thin and almost monotone in comparison to a well-implemented HD sound call, which is "warmer," capturing all the nuances of the voice. Mistaking "s" for "f" becomes a thing of the past. The possibilities this brings are manifold; the hurdles associated with bringing it to a wide audience are also considerable.
To optimize their wideband implementations, it is vital for phone manufacturers (fixed and cordless) to adhere to some important rules. The electro-acoustic components, especially the handset receiver and the hands-free loudspeaker, have to be able to reproduce the whole wideband frequency range with low distortion and high fidelity in their respective mountings.
This poses huge challenges to the device designers, especially for devices with a small form factor like cordless or mobile phones. First-class voice quality does come at a price, but one assumes that a mass market for the application will regulate this.
On the speakerphone side, it is advisable to encase the loudspeaker in order to avoid echo within the housing and to emphasize the lower frequencies, much like a home hi-fi speaker, which is also completely sealed.
In any VoIP phone (narrowband or wideband), delay is the most difficult hurdle to overcome in the quest for full-duplex performance. The human ear is insensitive to the echo that immediately follows the spoken word; otherwise, you would always hear a strong echo inside any given room.
But the greater the delay between one's speech and its echo, the more sensitive the ear becomes. That's why you always hear echo in a church. In a standard IP network, packet delays of more than 100ms are possible—that's one BIG church.
For this reason, additional effort has to be spent on reducing echo. The echo canceller inside a phone behaves like the human ear: it removes echo by estimating the echo path and subtracting the predicted echo from the microphone signal. This can be a difficult job, as it must work in any environment in which a phone can reside.
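A minimal sketch of that estimate-and-subtract loop, using a normalized least-mean-squares (NLMS) adaptive filter (one common approach; real cancellers add double-talk detection and nonlinear processing on top):

```python
# NLMS echo canceller sketch: x is the far-end signal driving the loudspeaker,
# d is the microphone pickup containing its echo; w adapts toward the echo
# path, and the residual e is what gets sent back to the far end.
import numpy as np

def nlms_echo_canceller(x, d, taps=256, mu=0.5, eps=1e-8):
    w = np.zeros(taps)                   # adaptive estimate of the echo path
    e = np.zeros(len(d))                 # residual (echo-cancelled) signal
    for n in range(taps, len(d)):
        x_vec = x[n - taps:n][::-1]      # most recent far-end samples
        y_hat = w @ x_vec                # predicted echo
        e[n] = d[n] - y_hat              # subtract prediction from mic signal
        # normalized step keeps adaptation stable regardless of signal level
        w += mu * e[n] * x_vec / (x_vec @ x_vec + eps)
    return e

# Self-test with a synthetic echo path: residual power drops well below the
# raw echo power once the filter converges.
rng = np.random.default_rng(0)
far = rng.standard_normal(16000)                      # 1s of far-end "speech"
path = rng.standard_normal(64) * np.exp(-np.arange(64) / 8.0)
mic = np.convolve(far, path)[:16000]                  # echo only, no near end
res = nlms_echo_canceller(far, mic)
print("echo power:    ", np.mean(mic[4000:] ** 2))
print("residual power:", np.mean(res[4000:] ** 2))
```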
Added markets
HD Voice opens a myriad of possibilities for system vendors and service providers to access new markets.
Interactive voice response: Can you imagine trying to book a flight with the aid of a call service using pre-recorded voice samples? Hardly. Today's voice-activated services mainly serve to drive people mad, unable to understand even the slightest deviation from the trained version of a word.
With wideband, the nuances in the human voice can be captured more easily, making voice-activated services a viable market with huge potential. Not only could we upgrade our broadband or phone services without actually speaking to anybody, but booking a flight, a hotel or a train ticket all become real possibilities.
Speech recognition systems will also benefit from the increased bandwidth and provide a better recognition rate, especially because sibilants can be recognized much more reliably. (A sibilant is the "s" sound we make when we talk; in this respect, "f" is often mistaken for "s" in a narrowband call.)
A text-to-speech (TTS) system converts normal language text into speech (using synthesized speech). The quality of a speech synthesizer is judged by its similarity to the human voice, and by its ability to be understood. An intelligible text-to-speech program allows people with visual impairments or reading disabilities to listen to written works on a telephone or PC.
Automatic translation: Voice samples are translated to text in real time.
Automotive speech recognition: Uses voice to command various functions in a car (wipers, radio, windows and so on, though not to drive it!).
Speech biometric recognition: Speaker-dependent authentication, with possible applications in workplaces or anywhere that requires some form of identification.
Dictation.
Hands-free computing: Speech recognition for commands on a PC for disabled users.
Home automation: Uses voice to command things we usually need a switch for, e.g. closing the shutters, turning off the lights, turning on the heating.
Medical transcription: The practice of modern medicine requires physicians to spend more time serving patient needs than creating documents in order to make financial ends meet. More modern methods of document creation are being implemented through computer and Internet technology, and voice recognition (VR) is one of them. With the power to capture up to 200 words per minute at 99 percent accuracy, voice recognition has freed physicians from the shackles of traditional transcription services.
Web radio on a cordless VoIP phone: The bandwidth provided by today's broadband connections is more than adequate to drive wideband down to the residential end user. To this end, the DECT Forum has initiated CAT-iq (Cordless Advanced Technology - internet and quality), a new cordless phone standard to tap the potential of wideband in VoIP endpoints.
Several steps are envisaged:
HD voice in cordless phones: Vendors are striving to bring new products to the market that support HD voice. As discussed earlier, this means upgrading the phones to include improved microphones and speakerphones to get the most out of the wideband codec.
Conferencing in wideband quality: With improved hardware, it will be possible to add new features like three-party conferencing in pristine quality, bringing a whole new experience to the user.
Web radio: As part of rolling out new services, future CAT-iq products will support features like news tickers and, more noticeably, Web radio in HD quality. This promises to be the killer application for VoIP in the home, marrying the power of the Internet with HD audio quality. Now Irish people in Australia can listen to Radio Cork, and Chinese in Munich can listen to Shanghai FM, without booting up the PC down in the basement.
Streaming audio content: CAT-iq will enable cordless equipment vendors and service providers to enter markets previously the domain of hi-fi specialists. Audio speakers containing a DECT receiver would be a perfect way to distribute audio content around the home, even between different floors. Not only is the air interface stable, but it also offers optimal power consumption for this application.