Breaking News

Wi-Fi news from LitePoint

Why UWB Certification Matters
Looking back at the history of great wireless innovations, the common factor that has launched technologies on a successful path is the broad ecosystem built around them. The success that Wi-Fi and Bluetooth® have enjoyed over the past 20 years is largely due to the fact that devices from any maker or brand simply work together. An incomplete, erroneous or proprietary implementation of a standard degrades device performance. In the worst-case scenario, the lack of compatibility completely prevents device-to-device communications. Conversely, the assurance of interoperability increases consumer confidence and ensures enhanced user experiences.

Industry consortia such as the Wi-Fi Alliance® (for Wi-Fi) or the Bluetooth SIG® (for Bluetooth®) have ensured device interoperability through technical requirements, test specifications, and certification programs. These consortia are made up of diverse companies working together toward the common goal of encouraging widespread adoption of their respective technologies. The FiRa™ Consortium has undertaken this task for Ultra-Wideband, with the mission to bring seamless, secure, and precise location awareness to people and devices. FiRa unites key industry players working together to deliver the building blocks needed to ensure broad adoption of UWB, with over 120 members, including all the top handset manufacturers, plus market leaders across chipsets, networking, secure access, and consumer technology. LitePoint was amongst the first companies to join FiRa at its inception in 2019. An important aspect of the consortium’s mission is the development of specifications and certification programs fostering interoperability among UWB chipsets, UWB-enabled devices, and solutions.

Source: FiRa Consortium

FiRa Certification Program
In October 2021, FiRa launched its certification program aimed at driving interoperability between UWB devices. For a device to be FiRa Certified™ and display the FiRa Certified logo, it must meet the FiRa-specified MAC and PHY Conformance Test Specifications and the MAC/PHY Interoperability Test Specification. For certification, FiRa members submit their devices to an independent authorized test lab (ATL) and follow the process described on the consortium’s website.

Source: FiRa Consortium

PHY Conformance Testing
UWB RF Physical layer (PHY) conformance verification is an important foundational step to drive an interoperable device ecosystem. The PHY Conformance test validates that chipsets and devices conform to FiRa’s UWB PHY Technical Requirements and PHY Conformance Test Specification.

FiRa’s PHY Technical Requirements are based on the IEEE 802.15.4 standard and the 802.15.4z amendment. The requirements further build on what the IEEE has already established for HRP (high rate pulse repetition frequency) mode to define a profiled feature set and performance requirements.

Source: FiRa Consortium

FiRa’s PHY Conformance Test Specification defines test cases that verify UWB devices’ conformance to the PHY Technical Requirements. It includes transmitter and receiver tests that validate compliant formatting and reception of UWB frames.

Transmitter conformance tests:

  • Transmitter frame formatting validation: SYNC, SFD, STS, PHR, DATA, CRC
  • Power Spectrum Density mask verification
  • Baseband impulse response
  • Carrier frequency tolerance
  • Pulse timing verification
  • Transmitter signal quality

Receiver conformance tests:

  • Receiver sensitivity
  • Receiver first path dynamic range
  • Receiver frame decoding verification: SYNC, SFD, STS, PHR, DATA, CRC
  • Receiver Dirty packet test (negative testing)

A UWB Device Under Test (DUT) must successfully pass all transmitter and receiver test cases for FiRa certification.

FiRa PHY Conformance Test Tool (PCTT)
LitePoint’s PHY conformance test platform has been developed based on the PHY Conformance Test Specification and validated by the FiRa Consortium. This test platform can be selected by ATLs for PHY Conformance Certification testing or by FiRa members as part of pre-certification testing. This complete solution includes both the hardware platform as well as test programs to control the tester and Device Under Test (DUT) using the UWB Command Interface (UCI).

The complete PCTT setup comprises the following elements:

  • IQgig-UWB platform: it integrates UWB signal generation and analysis and is capable of performing all DUT transmitter and receiver test cases.
  • IQfact+ software: it provides complete test automation including tester control, DUT control, and data collection. Pass/Fail results are provided for each test case along with detailed test logs that can be used for troubleshooting.
  • DUT control: the UWB DUT is controlled using the UWB Command Interface (UCI).
  • Physical connection: the link between the PCTT and the DUT is a vCOM interface, as specified by FiRa.

UWB Testing Beyond Certification
Beyond certification, the IQgig-UWB platform can be deployed in R&D for conducted or over-the-air (OTA) device characterization as well as high-volume production, making it the perfect platform to enable a cost-effective, seamless transition from the lab to production.

To learn more about FiRa’s certification process watch a replay of our webinar co-hosted by Comarch, FiRa and LitePoint:

To learn about our turnkey PCTT certification solution:

Wi-Fi 7: Wi-Fi’s Plaid Mode

With the 802.11be standard (a.k.a. Wi-Fi 7), Wi-Fi has gone to plaid! As hinted at by its name, the IEEE 802.11be EHT “Extremely High Throughput” standard primarily aims to provide super-fast data rates for next-generation Wi-Fi 7 devices. The IEEE’s Project Authorization Request(1) (PAR) document sets the impressive goal of supporting a maximum throughput of at least 30 Gbps while improving worst-case latency and jitter. These performance objectives are driven by the demands of next-generation applications such as high-definition video, industrial automation, immersive experiences, and gaming. To power this WLAN (r)evolution and address future innovative use cases, the 802.11be standard introduces key physical layer (PHY) enhancements and new features, such as doubling the channel size and increasing the modulation order.

In this blog, we will examine the applications driving the next generation Wi-Fi and their performance requirements as well as the key PHY changes necessary to meet the target performance objectives set by the 802.11be standard.

What drives the need for speed? 

With Wi-Fi 6 and 6E high-performance access points and client devices just rolling out, one can wonder why even higher performance is needed. But it is anticipated that by the time Wi-Fi 7 reaches maturity, emerging applications in the Extended Reality (XR) sphere will drive the need for even higher throughput. Use cases in Augmented Reality, Mixed Reality and Virtual Reality fall under the XR umbrella and encompass fields as diverse as gaming, industrial, health care and enterprise settings. High-throughput content delivery is required to ensure an excellent user experience and drive adoption of these new applications. More importantly, key performance indicators like latency, jitter and reliability play a critical role in guaranteeing the sense of reality and interactivity necessary for the best user experience.

  • High-resolution video is necessary for immersive user experience. Today’s 4K video (4096 x 2160 pixels) requires 20 to 40 Mbps throughput and 8K (7680 x 4320 pixels) content (four times the resolution of 4K) will require 80 to over 100 Mbps.
  • Video resolution is not the only driver for higher throughput, other parameters like higher frame rate (> 90 FPS), High Dynamic Range (HDR), stereoscopic video (separate video stream for left and right eye) and 6 Degrees of Freedom (DoF) motion drive these requirements even further. It is expected that 200 Mbps to 5 Gbps (2) are necessary for a rich immersive video experience.
  • The data rate requirements are even higher for uncompressed/raw video or lightly compressed video (> 30 Gbps). Uncompressed or lightly compressed content reduces processing latency at the end points, but at the cost of the higher data rates needed for transport.

Figure 1: End to end latency

  • Low latency is critical for interactions during game play, or for mission critical applications like industrial robotics operations or medical procedures. This type of application requires real-time exchanges where the data provided in response to the user’s action must be delivered immediately.  The system’s feedback needs to be optimized for human reaction time depending on the senses that are used. For example, the Motion to Photon (MTP) latency between head motion and display should remain < 20 ms (2) for user comfort. The delay on the air interface is only a part of the end-to-end latency and the requirements will be driven by the type of application.

A look at Wi-Fi’s speed evolution

The IEEE’s 802.11be target performance of reaching a throughput of at least 30 Gbps may seem ambitious but looking back at Wi-Fi’s generational evolution of the past 20 years, the most striking progress is the pace at which the top data rates have increased. Thanks to a mix of technological advances and breakthrough new features, each generation has delivered higher performance than its predecessor.

Figure 2: Wi-Fi generational speed evolution

The current generation of Wi-Fi 6 devices is based on the 802.11ax standard. At its inception in 2014, the High Efficiency WLAN (HEW) 802.11ax standard was directed at improving spectrum efficiency and performance in dense networks. This goal was a departure from previous Wi-Fi generations, which had mostly focused on throughput enhancements. Via innovative features, most notably Uplink and Downlink OFDMA and MU-MIMO, Wi-Fi 6 devices in dense environments can achieve an average per-user throughput improvement of 4 times over Wi-Fi 5.

Furthermore, with the expansion of Wi-Fi in the 6 GHz band, Wi-Fi 6E devices can now benefit from up to 1200 MHz of uncongested spectrum. This additional spectrum opens the door to numerous new channels and enough contiguous spectrum to support the creation of extra-wide 320 MHz channels. This is why, with the 802.11be standard, the focus is once again on delivering even higher throughput for Wi-Fi users, with top speeds up to 4.8 times faster than Wi-Fi 6/6E.

Figure 3: Wi-Fi 6/6E and Wi-Fi 7 Speed Comparison for 2 Spatial Stream devices.

Wi-Fi 7 can reach a top link speed of 46 Gbps with 16 spatial streams and maximum 320 MHz channel bandwidth. For typical clients with 2 spatial streams, the highest data rate is around 5.8 Gbps on a single link, 2.5 times higher than similar 802.11ax clients.
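
These top-speed figures follow directly from the standard OFDM data-rate formula: spatial streams × data subcarriers × bits per symbol × coding rate, divided by the symbol duration. A minimal sketch, where the subcarrier counts, rate-5/6 coding and 0.8 µs guard interval are assumed values taken from the 802.11ax/be numerology:

```python
def phy_rate_gbps(n_ss, n_data_subcarriers, qam_bits, coding_rate,
                  symbol_us=12.8, gi_us=0.8):
    """OFDM PHY rate: streams x subcarriers x bits/symbol x code rate / symbol time."""
    bps = n_ss * n_data_subcarriers * qam_bits * coding_rate / ((symbol_us + gi_us) * 1e-6)
    return bps / 1e9

# Wi-Fi 7: 320 MHz channel (3920 data subcarriers), 4096-QAM (12 bits), rate-5/6 coding
print(round(phy_rate_gbps(16, 3920, 12, 5/6), 1))  # 46.1 Gbps with 16 spatial streams
print(round(phy_rate_gbps(2, 3920, 12, 5/6), 1))   # 5.8 Gbps for a 2-stream client

# Wi-Fi 6: 160 MHz channel (1960 data subcarriers), 1024-QAM (10 bits)
print(round(phy_rate_gbps(2, 1960, 10, 5/6), 1))   # 2.4 Gbps
```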

What powers the Wi-Fi speed evolution?

Wi-Fi’s link speed evolution is powered by PHY technological enhancements along 3 dimensions:

  • Modulation Order (QAM)
  • Channel Size
  • Number of Spatial Streams

Figure 4: Wi-Fi PHY generational evolution

Modulation Order

Wi-Fi uses Quadrature Amplitude Modulation (QAM) for efficient data encoding. With each new Wi-Fi generation, higher data rates are achieved by encoding more data bits per symbol. Higher-order QAM transmits more data bits while maintaining the same spectrum footprint, thereby achieving better efficiency.

  • 802.11ac (Wi-Fi 5) supports up to 256-QAM where 8 bits of data are encoded per symbol
  • 802.11ax (Wi-Fi 6/6E) supports up to 1024-QAM, where 10 bits of data are encoded per symbol
  • 802.11be introduces 4096-QAM modulation with 12 bits of data per symbol

4096-QAM delivers a 20% data rate increase over 1024-QAM.
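
The bits-per-symbol figures above are simply log2 of the constellation size, which is where the 20% gain comes from:

```python
import math

def qam_bits(order):
    """Bits encoded per symbol for an M-QAM constellation."""
    return int(math.log2(order))

print(qam_bits(256))   # 8  (Wi-Fi 5)
print(qam_bits(1024))  # 10 (Wi-Fi 6/6E)
print(qam_bits(4096))  # 12 (Wi-Fi 7)
print(f"{qam_bits(4096) / qam_bits(1024) - 1:.0%}")  # 20% gain from modulation alone
```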

Channel Size

By doubling Wi-Fi’s channel bandwidth, nearly twice the amount of data can be carried in a single transmission. The 802.11be standard defines operation in the 6 GHz band, with a channel plan currently defined for up to 6 overlapping 320 MHz channels. The availability of 1200 MHz of contiguous spectrum in the 6 GHz band has opened the door to doubling the channel bandwidth from 160 MHz (the maximum in 802.11ax) to 320 MHz, hence doubling the maximum throughput. Availability of the 6 GHz band is, however, subject to regulatory approval, and not all worldwide regions can enjoy the same amount of spectrum. The 5 GHz and 2.4 GHz bands cannot support 320 MHz channels; therefore, this Wi-Fi 7 feature will be accessible to only a subset of users.

Spatial Streams

Spatial streams increase the system’s throughput by transmitting independent data streams on multiple antennas simultaneously. The maximum throughput of a system with 8 spatial streams is therefore 8 times that of a system with a single antenna. The 802.11ax standard defined MIMO support for up to 8 spatial streams, a capability already supported by some AP chipsets. The 802.11be generation could support up to 16×16 MIMO in the future, thus doubling the maximum throughput compared to 802.11ax. While the theoretical maximum throughput can only be achieved between devices with the same antenna count, the number of MIMO streams for client stations is typically limited to 2 or 3. A high number of spatial streams is an important factor for increased spectral efficiency with the implementation of Multi-User MIMO (MU-MIMO), another feature supported in the 802.11be standard.

Multi-Link Operation (MLO), a breakthrough feature defined in the 802.11be standard, adds a new dimension to Wi-Fi’s link speed evolution diagram.

Figure 5: 802.11be Multi-Link Operation

With a common MAC layer and separate PHY layers, Wi-Fi 7 Access Points and Client Stations are capable of transmitting and receiving simultaneously on multiple links. In the hypothetical case of an AP Multi-Link Device (MLD) operating a 320 MHz channel in the 6 GHz band and a 160 MHz channel in the 5 GHz band, connected to a dual-radio MLD client (2×2 + 2×2), the throughput could reach up to 8.6 Gbps with data transmitted simultaneously on both links. By taking advantage of intelligent traffic scheduling and prioritization, MLO can reduce latency(3) and jitter by prioritizing data transmission on the links with the best RF conditions, or improve reliability by duplicating data on multiple links.
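
The hypothetical 8.6 Gbps figure is just the sum of the two links’ individual PHY rates. A quick sketch, with subcarrier counts, rate-5/6 coding and the 0.8 µs guard interval assumed from the 802.11be numerology:

```python
def link_rate_gbps(n_ss, n_data_subcarriers, qam_bits=12, coding_rate=5/6,
                   symbol_us=12.8, gi_us=0.8):
    """Per-link OFDM PHY rate in Gbps."""
    return n_ss * n_data_subcarriers * qam_bits * coding_rate / ((symbol_us + gi_us) * 1e-6) / 1e9

link_6ghz = link_rate_gbps(2, 3920)  # 2x2 on a 320 MHz channel -> ~5.76 Gbps
link_5ghz = link_rate_gbps(2, 1960)  # 2x2 on a 160 MHz channel -> ~2.88 Gbps
print(round(link_6ghz + link_5ghz, 1))  # 8.6 Gbps aggregate across both links
```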


While Wi-Fi 6 and Wi-Fi 6E devices are well positioned for the needs of today’s applications, it is anticipated that by the time Wi-Fi 7 reaches maturity, the consumer, enterprise, and industrial spaces will be upended by new applications requiring content delivery with extremely high throughput, low latency, low jitter and high reliability. The 802.11be standard is being defined to serve these applications so that Wi-Fi 7 can become the main connectivity technology capable of delivering the highest user experience. Wi-Fi devices’ RF PHY performance is the critical foundation that will unlock this potential.

For more information visit the IQxel-MX page to learn about LitePoint’s test solutions for 802.11be or watch a replay of our webinar on Wi-Fi 7.





Wi-Fi 6/6E is a significant evolution of the Wi-Fi standard that boosts throughput and reduces latency to make it better for high-capacity applications as well as for emerging applications such as AR/VR, ultra-high definition video streaming, 5G offload and others.

These changes and the new opportunities they open up were the focus of a recent webinar hosted by RCR Wireless’ Catherine Sbeglia featuring LitePoint’s Adam Smith and Kevin Robinson from the Wi-Fi Alliance.

Wi-Fi 6 shipments have taken off, surpassing 50% market share in three years – a feat that took Wi-Fi 5 four years to achieve.

Wi-Fi 6 has a fundamentally different way of operating, using new technologies including the orthogonal frequency-division multiple access (OFDMA) channel access method and 1024-QAM modulation to get this performance. With Wi-Fi 6E, the technology pushes into the 6 GHz frequency band for the first time.

All of this sets the stage for understanding the new technologies that will impact the test strategies used to ensure the quality of Wi-Fi 6/6E products. Here are a few key points brought up by Adam during the webinar about the technologies and capabilities involved with Wi-Fi 6 and beyond:

With Wi-Fi 6E, systems need to support very wide spectrum – the 6 GHz band adds 1.2 GHz of spectrum on top of the 2.4 and 5 GHz bands. The RF front end will not be able to handle this with a single filter bank or amplifier, resulting in more complex RF front ends.

The other big factor with the RF front end will be band separation, as the possibility exists that, in a tri-band router, a transmitter in the 5 GHz band could desensitize the receiver in the 6 GHz band.

The switch to OFDMA allows multi-user operation, enabling multiple users to transmit and receive simultaneously and making more efficient use of the channel. In Wi-Fi 6/6E, OFDMA allows the router to divide a channel into subchannels that can each support a user. This will require devices to have low frequency error so as to not interfere with a neighboring subchannel. Tight timing is also important to enable simultaneous transmissions, so that all the client station devices respond to the access point within their allotted times.
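
The subchannel math behind OFDMA comes from the 802.11ax numerology, which narrows the subcarrier spacing to 78.125 kHz. A small illustration (resource-unit figures assumed per the 802.11ax specification):

```python
# 802.11ax numerology: 78.125 kHz subcarrier spacing
subcarrier_spacing_hz = 78.125e3
print(int(20e6 / subcarrier_spacing_hz))  # 256 subcarriers in a 20 MHz channel

# The smallest resource unit (RU) is 26 tones; up to nine 26-tone RUs fit in a
# 20 MHz channel (the remaining tones are guard/DC), so up to 9 users can
# share one 20 MHz channel simultaneously.
max_users_per_20mhz = 9
```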

And if that isn’t enough, Wi-Fi 7 is in the standards process, offering all the efficiency of Wi-Fi 6/6E with extremely high throughput, including 4096-QAM, 16 spatial streams and 320 MHz channels.

The full webinar discussion of the changes in Wi-Fi – including questions and answers – is available for replay and can be found here: State of Wi-Fi 6/6E

New Wi-Fi standard will provide breakthrough speed and performance for video streaming, gaming, and virtual reality applications

SAN JOSE, California — Dec.15, 2021 – LitePoint, a leading provider of wireless test solutions, today announced it is collaborating with leading chipset maker MediaTek to deliver turnkey design validation test solutions for MediaTek’s Wi-Fi 6/6E Filogic chips. The collaboration will enable MediaTek customers to simplify and speed test results for this new family of high-performance connectivity chipsets.
Wi-Fi 6/6E provides breakthrough connectivity experiences delivering multigigabit, low latency Wi-Fi and reliable connectivity for applications like streaming, gaming, AR/VR and more.

Through this strategic collaboration, LitePoint worked with MediaTek to create a version of LitePoint IQfact+ test automation software that enables MediaTek customers to test their Wi-Fi 6/6E designs with a complete and easy to use test flow. IQfact+ software works with the LitePoint IQxel family of test systems to provide MediaTek customers with chipset-specific test development software that enables rapid design validation through volume manufacturing with minimal engineering effort.

“LitePoint continues to drive innovation in test solutions for Wi-Fi products and we are pleased to be working with MediaTek to enable their customers to bring Wi-Fi 6/6E devices to market,” said Anna Smith, Vice President Worldwide Sales, LitePoint. “IQfact+ and IQxel offer MediaTek’s customer base a complete and easy-to-use testing platform for rapid design validation.”

“As the market changes and more people are working and studying from home, many applications rely on stable networking service, like video conferencing and 4K streaming. MediaTek’s Filogic family delivers fast, reliable and always-on connected solutions to give consumers a smooth, fast and interactive connected experience with incredible video quality,” said Alan Hsu, Corporate Vice President & General Manager, Intelligent Connectivity at MediaTek. “LitePoint’s testing platform will help our customers quickly and easily validate our Wi-Fi 6/6E solutions to speed up the time to market for the next generation of devices.”

Technical Details
LitePoint’s IQxel fully integrated RF PHY test platforms include signal generation and analysis for Wi-Fi connectivity, meeting the needs of product development and high-volume manufacturing. The IQxel product family includes support for WLAN IEEE 802.11a/b/g/n/ac/ax/be with a frequency range from 400 to 7300 MHz for validation in the 6 GHz band, as well as a full range of connectivity standards such as Bluetooth 5.x, Zigbee and Z-Wave.
To learn more about LitePoint’s IQxel product family, please visit:

LitePoint’s Rex Chen is the author of this blog, which is based on his presentation at the RCR Wireless News Editorial Webinar titled Minding the mid-band: C Band and CBRS efforts.

Wireless spectrum has become one of the most important resources for staying connected because spectrum is the utility for all wireless communication. Mid-band spectrum, specifically in the 2 GHz – 4 GHz range, is playing an important role in 5G, both globally and in the U.S. Key mid-band allocations include:

  • Citizens Broadband Radio Service (CBRS), 150 MHz of spectrum in the 3550 MHz to 3700 MHz band in the U.S. It previously was used by the U.S. government until the FCC identified it as spectrum that could be used for broadband services.
  • C band refers to frequencies between 4000 MHz and 8000 MHz that are now being used for satellite communications, RADAR systems and other unlicensed use cases such as some Wi-Fi enabled devices.

In 2020, the U.S. C Band spectrum auction raised more than $81 billion, demonstrating the immense value and potential of mid-band spectrum for wireless communication. Each mobile network operator has a different mid-band strategy based on spectrum holdings – both in terms of how much spectrum they own and whether they need more low or high band frequencies.

Mid-band spectrum has the advantage of known transmission characteristics that are similar to existing 4G frequency bands. Mid-band frequencies are considered “goldilocks spectrum” because they combine good coverage (similar to low-band frequencies) with fast data rates (similar to high-band frequencies). But these RF bands pose challenges due to how they have been divided among incumbent users, so MNOs need to devise spectrum sharing and spectrum usage strategies.

Smartphone designs using mid-band spectrum can support world phones with global cellular coverage capabilities. In many instances the band offers contiguous spectrum, and the short wavelength allows innovations like MIMO with between 8 and 16 spatial streams. This enables an ideal degree of spatial multiplexing, where operators can use beamforming techniques to target users spatially – increasing capacity.

Why Mid-Band is so Valuable
5G mid-band deployments have been occurring over the past year and a half across multiple regions, including the U.S., Europe, Korea and China. This trend will continue as MNOs start to work with higher frequency ranges of the spectrum.

Mid-band spectrum provides the opportunity for contiguous spectrum with the potential for channels that are 200 to 400 MHz wide. This presents new challenges for filters and other RF components, and requirements that must be met by both mobile devices and base stations. LitePoint is seeing instances where, in actual deployments, carriers aren’t using the entire capability because they’re still testing to make sure it works across the board.

The increase in remote work, video conferencing and other high-bandwidth applications means that uplink capacity is becoming more important than ever. This in turn is leading carriers and chipset makers to rethink how they mix and match spectrum resources. The variety of 5G use cases can make deployment challenging – but the breadth of device requirements has made mobile network carriers understand that mid-band is a necessary part of 5G growth moving forward.

To gain more insights on 5G mid-band status and wireless test challenges in this spectrum, as well as hear what other industry leaders have to say, watch this editorial webinar. You can access it on the LitePoint webinar page.

LitePoint’s Eve Danel is the author of this blog. In this post, you’ll learn about updates to the Bluetooth, Ultra-Wideband and Wi-Fi specifications that improve location capabilities.

Recent standards updates to Bluetooth, Ultra-Wideband and Wi-Fi have introduced new and more accurate location capabilities that allow wireless device designers to use these technologies to support new indoor location applications. In this blog post we will review these methods and see how they compare.

These location capabilities can be used to find people or objects with high accuracy where GPS and other satellite technologies – which are primarily for outdoor applications – lack precision or are blocked by buildings or other obstructions. While other indoor positioning systems can detect the location of an object, their accuracy can be on the order of several meters – which means that they cannot be used in many applications where more precision is required.

Both Bluetooth and Wi-Fi have already been used in many deployments for indoor location. Previous versions of these technologies had location capabilities based primarily on the received signal strength indicator (RSSI). This method determines the distance between the transmitter and the receiving point by measuring the received signal strength, then performs positioning calculations based on the corresponding signal attenuation. The accuracy of the RSSI method is limited by several factors: obstacles between the transmitter and the receiver cause additional attenuation that reduces measurement accuracy, and signal strength measurements that are not calibrated can be off by a few dBs, making the location calculation inaccurate.
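
To illustrate why calibration matters, RSSI ranging typically uses a log-distance path-loss model. A minimal sketch, where the 1 m reference power and the path-loss exponent are assumed, device-specific values:

```python
def rssi_to_distance_m(rssi_dbm, ref_power_dbm=-59, path_loss_exp=2.0):
    """Log-distance path-loss model. ref_power_dbm is the expected RSSI at 1 m
    (a calibration value); path_loss_exp is ~2 in free space, higher indoors."""
    return 10 ** ((ref_power_dbm - rssi_dbm) / (10 * path_loss_exp))

print(rssi_to_distance_m(-59))            # 1.0 m at the calibration point
print(round(rssi_to_distance_m(-79), 1))  # 10.0 m in free space
# A 3 dB calibration error on the same reading shifts the estimate to ~14 m:
print(round(rssi_to_distance_m(-79, ref_power_dbm=-56), 1))
```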

The RSSI method is also inherently not very secure. It can be subjected to a “man-in-the-middle” (MITM) attack. This happens when attackers amplify the transmission to make the transmitter appear closer than it really is.

The next generation of positioning technology aims to unlock an even higher level of security as well as accuracy down to the tens of centimeters. Micro-location opens up a new generation of use cases allowing users to interact very precisely with their environment and the objects around them.

Bluetooth Low Energy 5.1

The Bluetooth 5.1 standard adds several direction-finding enhancements, providing angle information for the incoming signal. The standard specifies two methods: angle of arrival (AoA) and angle of departure (AoD).

For direction finding, Bluetooth 5.1 devices transmit specially formatted packets, and Bluetooth 5.1 receivers use an antenna array with at least 2 antennas to compute the angle of incidence based on the phase difference between the antennas, the signal’s wavelength and the distance between the antennas.

Combining RSSI information with either AOA or AOD provides the ability to determine the device location with better accuracy.
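
The angle computation itself is straightforward trigonometry on the measured phase difference. A sketch, with half-wavelength antenna spacing and the 2.402 GHz BLE advertising channel assumed for the example:

```python
import math

C = 3e8  # propagation speed, m/s

def angle_of_arrival_deg(phase_diff_rad, antenna_spacing_m, freq_hz=2.402e9):
    """Incidence angle from the phase difference between two antennas."""
    wavelength = C / freq_hz  # ~12.5 cm at 2.4 GHz
    return math.degrees(math.asin(phase_diff_rad * wavelength /
                                  (2 * math.pi * antenna_spacing_m)))

spacing = (C / 2.402e9) / 2  # half-wavelength antenna spacing
print(round(angle_of_arrival_deg(math.pi / 2, spacing), 1))  # 30.0 degrees
```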

Figure 1: Bluetooth Low Energy 5.1 Direction Finding

Ultra-Wideband (UWB)

Ultra-Wideband, or UWB, based on the IEEE 802.15.4z-2020 HRP ERDEV (Enhanced Ranging Devices) amendment, provides high-accuracy and secure positioning.

UWB doesn’t use signal strength to evaluate distance. Instead, it uses time of flight (ToF).

ToF measures the propagation time that it takes for the signal to go from the transmitter to the receiver. Signals travel at the speed of light, regardless of environment. Distance estimation based on timing is more robust and is not affected as much by the environment as the RSSI methods that were described earlier.
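
In its simplest two-way form, the distance follows from subtracting the responder’s known reply delay from the measured round trip. A sketch of single-sided two-way ranging, with timestamp values invented for illustration:

```python
C = 299_792_458  # speed of light, m/s

def tof_distance_m(t_round_s, t_reply_s):
    """Single-sided two-way ranging: the measured round trip minus the
    responder's reply delay leaves twice the one-way flight time."""
    return C * (t_round_s - t_reply_s) / 2

# 100 ns of flight each way plus a 1 us reply turnaround at the responder
print(round(tof_distance_m(1.2e-6, 1.0e-6), 2))  # 29.98 m
```

Note the scale: 1 ns of timing error corresponds to roughly 30 cm of distance error, which is why UWB’s fine time resolution matters.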

UWB is more robust to the man-in-the-middle attack described earlier because a signal cannot be intercepted and retransmitted without adding latency. In addition, frames containing an STS (Scrambled Timestamp Sequence) with AES-128 encryption provide an added layer of security.

Figure 2: UWB Time of Flight

Wi-Fi IEEE 802.11az

The IEEE 802.11az Next Generation Positioning (NGP) standard is nearing completion and builds upon the fine timing measurement (FTM) feature that was part of the IEEE 802.11mc standard release. Unlike the RSSI method, which uses signal strength to evaluate distance, FTM uses round-trip time information to estimate the distance between Wi-Fi enabled stations and access points.

The 802.11az standard is designed to improve upon legacy FTM with several enhancements that are aligned with the 802.11ax/Wi-Fi 6 standard. The Wi-Fi Alliance is also working on the next-generation Wi-Fi Location R2 program based on NGP.

Figure 3: Wi-Fi Timing Measurement Round Trip Time

FTM uses Time of Flight to measure distance, with a mechanism very similar to UWB’s ToF described earlier. Wi-Fi client stations and access points exchange a series of messages with Time of Departure and Time of Arrival timestamps that allow them to derive the round-trip time. The round-trip time mechanism doesn’t require the clocks on the devices to be synchronized.
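
The four-timestamp round-trip computation can be sketched as follows (timestamp values are invented for illustration):

```python
C = 299_792_458  # speed of light, m/s

def ftm_distance_m(t1, t2, t3, t4):
    """FTM round-trip time from four timestamps: t1/t4 are departure/arrival
    at the initiator, t2/t3 are arrival/departure at the responder. Each
    difference uses a single device's clock, so no synchronization is needed."""
    rtt = (t4 - t1) - (t3 - t2)
    return C * rtt / 2

# 50 ns of flight each way with 10 us of processing time at the responder
print(round(ftm_distance_m(0.0, 50e-9, 10.05e-6, 10.1e-6), 2))  # ~14.99 m
```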

802.11az is designed to improve upon the legacy FTM by utilizing the latest features in the 802.11ax standard.

  • For improved accuracy, 802.11az makes use of wider channel bandwidths, up to 160 MHz for Wi-Fi 6 and potentially extended to 320 MHz for 802.11be (the Wi-Fi 7 generation), since wider bandwidth provides higher resolution. For better resilience to multi-path effects, MIMO signal properties are utilized.
  • For improved protocol efficiency 802.11az reuses the 802.11ax PHY Null Data Packet (NDP) frames that are already defined for beamforming sounding. The timestamps are based on the exchange of HE ranging null data packet (NDP) frames or HE trigger-based (TB) ranging NDP.
  • The Multi-User capabilities of Wi-Fi 6 are also advantageous to 802.11az: when using trigger-based ranging with Uplink and Downlink OFDMA, the AP can effectively get ranging information from multiple stations in a single transmit opportunity. This significantly reduces the overhead needed for exchanging ranging information and improves scalability to more stations.
  • 802.11az introduces PHY and MAC layer Security enhancements to prevent spoofing or eavesdropping on ranging messages.

Testing Concerns

What’s true of all these wireless location technologies is that careful design validation testing is required to ensure the best performance. LitePoint’s wireless test product portfolio has the right solution to enable these technologies.

The IQgig-UWB™ tester is a fully integrated, one-box test solution for physical-layer testing of UWB-enabled devices. It supports all the required transmitter and receiver testing, and it also includes a specific precision trigger-and-response mechanism for testing UWB ranging that allows calibration and verification of ToF and AoA measurements.

The IQxel-MW 7G is designed to meet the demands of Wi-Fi connectivity testing up to 7.3 GHz for Wi-Fi 6/6E, and it supports Bluetooth device standards testing, including Bluetooth LE 5.1.

To listen to my entire webinar on this topic, please go to the LitePoint webinar page.

LitePoint releases turnkey manufacturing test solution for Microchip’s next generation wireless ICs

SAN JOSE, California – LitePoint, a leading provider of wireless test solutions, today announced a collaboration with Microchip Technology Inc. to deliver simplified design validation and turnkey manufacturing test solutions for next-generation Internet of Things (IoT) systems, based on Microchip’s Bluetooth and Wi-Fi chipsets.

As part of the collaboration, LitePoint has released a version of its IQfact+™ test automation software tailored for Microchip’s new WFI32E01 series of Wi-Fi MCU modules.

LitePoint’s IQfact+ is a turnkey, chipset-specific test development software that enables rapid-volume manufacturing with minimal engineering effort. With IQfact+, wireless system developers get access to proven software that enables control of devices under test and the test system with calibration routines that are uniquely optimized to reduce time and maximize throughput.

Microchip Technology is a leading provider of Bluetooth and Wi-Fi ICs, modules and reference designs that ensure robust, reliable and safe connections. Microchip’s comprehensive wireless connectivity portfolio meets the range, data rate, interoperability, frequency and topology needs of many IoT applications.

“The rapid growth of IoT is driving the deployment of billions of wirelessly-connected products, making the wireless connection quality of these devices paramount,” said Adam Smith, Director of Product Marketing at LitePoint. “We’re excited to work with Microchip to bring a simplified wireless test solution that will ensure their customers are able to set up and test the wireless quality of Microchip-based devices in a matter of hours, compared to traditional test development, where set up and deployment can take weeks, or even months.”

“LitePoint provides a wireless test system that offers significant time-to-market advantages which are critical to our customers who want to integrate our chip-down reference designs into their applications,” said Steve Caldwell, VP of Microchip’s wireless solutions group. “Combining our innovation with LitePoint’s high volume production test technology provides our customers the tools they need to meet their customer’s demand for increasingly complex products, delivered at the highest quality.”

Technical Details
LitePoint’s IQxel fully integrated test platforms include signal generation and analysis for Wi-Fi connectivity, meeting the needs of product development and high-volume manufacturing. The IQxel product family includes support for WLAN IEEE 802.11a/b/g/n/ac/ax/be, as well as a full range of connectivity standards (Bluetooth 5.x, Zigbee, Z-Wave), with a frequency range from 400 to 6000 MHz, upgradable to 7300 MHz for the 6 GHz band.

For faster time to market, turnkey IQfact+ software solutions offer customized testing of leading chipsets and enable thorough design verification and rapid volume manufacturing with minimal engineering effort.
All trademarks are the property of their respective companies.

Testing UWB’s New Features

Wireless technologies that enable new micro-location capabilities are expanding the use cases of applications, including those that demand a higher level of security. In the automotive industry, for example, there is demand for more secure authentication between the car and the key fob. This has created a perfectly timed opportunity for ultra-wideband (UWB) technology, whose unique advantages in localization, position detection and direction finding make it a better fit than other technologies for these increased security needs. UWB is ideal for applications that enable two devices to securely find each other and understand where they are relative to one another in space.

UWB uses a technique called time of flight (TOF) to enable its secure method of authentication. Unlike other technologies, which estimate the distance separating two devices by signal strength, TOF involves sending and receiving signals and calculating distance based on the time it takes for a complete send/receive cycle. It is relatively easy to hack signal-strength measurements, but time is nearly impossible to fake. This high level of security makes UWB relevant today.
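The arithmetic behind TOF is simple enough to sketch (an illustrative calculation, not LitePoint code): the signal covers the distance twice per send/receive cycle, so halving the travel time and multiplying by the speed of light yields the range.

```python
# Speed of light in meters per second
C = 299_792_458

def tof_distance_m(round_trip_s: float, reply_delay_s: float = 0.0) -> float:
    """Estimate distance from one send/receive cycle.

    round_trip_s: time from sending the ping to receiving the pong.
    reply_delay_s: known processing delay inside the responder.
    The signal travels the distance twice, hence the division by 2.
    """
    return C * (round_trip_s - reply_delay_s) / 2

# A ~66.7 ns round trip (with no reply delay) corresponds to ~10 m.
print(round(tof_distance_m(66.7e-9), 2))
```

This is also why signal strength is easy to spoof but time is not: an attacker can amplify a relayed signal, but cannot make it arrive earlier than the speed of light allows.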

The three primary use cases for UWB are:

Access control: Embedding UWB into a device such as a key fob allows tight access control, whether to a secure building or a car. A UWB-based system can estimate proximity to within 10 centimeters. The access system can be set to allow access only when an authorized user is within a given distance, and anyone not carrying the UWB device will be denied entry automatically.

Location-based services: The precise proximity detection enables location-based services such as navigation for indoor spaces and contextual content based on the user’s location. As you move from room to room, the system provides information relevant to each room. Think of a museum, or an office complex, where there is information readily available specific to a certain location within the facility.

Device-to-device communication: UWB allows devices to securely share information, and this can be combined with its localization capabilities to push relevant information from one device to another.

UWB Ranging: A Closer Look
UWB uses electronic pulses that are short in time and wide in frequency—hence the name ultra-wideband. Essentially, one UWB device emits a signal, and a device that detects the signal sends back a response. This is known as single-sided two-way ranging. By measuring the time between sending the signal (called a ping) and receiving the reply (pong), the sending device can calculate the distance separating them with precision.

In some cases, the sending device will send another signal upon receiving the reply, which allows both devices to measure the time. This technique, called double-sided two-way ranging, adds a second layer of security and accuracy to the positioning.
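As a rough sketch of how the two schemes combine timestamps (the variable names are illustrative, not from any specific chipset API), note how the double-sided formula cancels most of the clock-offset error between the two devices:

```python
C = 299_792_458  # speed of light, m/s

def ss_twr_tof(t_round: float, t_reply: float) -> float:
    """Single-sided two-way ranging: one round trip measured by the
    initiator, minus the responder's known reply delay."""
    return (t_round - t_reply) / 2

def ds_twr_tof(t_round1: float, t_reply1: float,
               t_round2: float, t_reply2: float) -> float:
    """Double-sided two-way ranging: both devices measure a round trip,
    which cancels most of the clock-offset error between them."""
    return ((t_round1 * t_round2 - t_reply1 * t_reply2)
            / (t_round1 + t_round2 + t_reply1 + t_reply2))

# Two devices 3 m apart: the true one-way flight time is ~10 ns.
tof = 3 / C
est = ds_twr_tof(2 * tof + 100e-6, 100e-6, 2 * tof + 150e-6, 150e-6)
print(round(C * est, 3))  # recovered distance in meters
```

With ideal timestamps the double-sided estimate recovers the 3 m distance exactly, even though the two reply delays differ.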

Crucial UWB Quality Tests

When evaluating UWB device performance, there are three areas where measurements are important.

Regulatory: UWB signals overlap licensed spectrum, so it is important to validate that the device is not running afoul of regulations established by each country’s spectrum governing bodies. While absolute transmitter power and the power spectrum mask are the main regulatory metrics, other aspects of the device’s performance, including transmitter calibration and pulse shape, indicate whether the device will meet the power mask requirements. The more power the device emits, the more energy will splatter out of the desired channel and potentially interfere with neighboring devices or, worse, violate licensed frequency bands. Mask refers to the device’s ability to prevent that signal bleed, caused by spurious emissions or transmitter nonlinearity, while the power output determines the operating range of the end device. Good calibration maximizes power while maintaining valid spectral mask performance.

Interoperability: A key metric for UWB interoperability is frequency accuracy. Wireless devices use tuning capacitors to adjust the crystal that determines the broadcast frequency. If this tuning isn’t done properly, the result is carrier frequency offset (CFO), or frequency error. Put simply, this means the device isn’t exactly operating at the same frequency as the other devices. If the CFO is small, the device will be able to communicate, but ranging measurements will have error. If CFO is large, the badly tuned device might not communicate with others at all. Frequency calibration is a key part of ensuring device interoperability. Another tool is a measurement called the normalized mean square error (NMSE), which captures several different effects in the system and provides a summary measurement.
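To see why frequency accuracy matters, note that a crystal error expressed in parts per million scales directly with the carrier frequency. A quick illustrative calculation (the 6.5 GHz carrier is simply a representative UWB frequency, not a requirement):

```python
def cfo_hz(carrier_hz: float, crystal_error_ppm: float) -> float:
    """Carrier frequency offset caused by a crystal tuning error.

    A part-per-million error scales with the carrier frequency,
    which is why frequency calibration matters more at UWB carriers
    (roughly 6-8 GHz) than at lower bands.
    """
    return carrier_hz * crystal_error_ppm * 1e-6

# A 1 ppm crystal error at a ~6.5 GHz UWB carrier already puts the
# transmitter 6.5 kHz away from its nominal frequency.
print(cfo_hz(6.5e9, 1.0))
```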

Time of flight: There are two schools of thought about calibrating TOF. One is to calibrate the system at the printed circuit board level and assume that the over-the-air performance with the antenna attached will be within a narrow tolerance range. The other approach is to test the assembled device and calibrate TOF using its actual over-the-air performance. Either approach is effective, but the key takeaway here is that there are many different components and sources of variation in the system. The goal is to minimize variation and achieve the positional accuracy that UWB can provide.

The Angle of Arrival: What it is and How it Works
UWB can work in two modes, triangulation and peer-to-peer. Triangulation requires a series of “anchor” devices placed in the environment that a UWB “tag” device can use to compare its position relative to the multiple anchors. Peer-to-peer operation allows two devices to communicate directly without the need for anchor devices in the environment, but it requires that the device can determine both distance and direction. Distance is obtained from TOF; direction is determined by Angle-of-Arrival (AoA). AoA requires that the device has at least two antennas. The device measures the time (or phase) difference between the antennas as they detect an incoming signal, and this difference enables it to calculate the angle from which the signal is coming.
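The geometry behind the phase-to-angle conversion can be sketched in a few lines (illustrative values; a real device also calibrates out antenna and RF-path effects):

```python
import math

def angle_of_arrival_deg(phase_diff_rad: float, antenna_spacing_m: float,
                         wavelength_m: float) -> float:
    """Angle of arrival from the phase difference between two antennas.

    A plane wave arriving at angle theta travels an extra
    d*sin(theta) to reach the second antenna, producing a phase
    difference of 2*pi*d*sin(theta)/lambda. Solve for theta.
    """
    sin_theta = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    return math.degrees(math.asin(sin_theta))

# Half-wavelength antenna spacing at a ~6.5 GHz carrier (illustrative).
wavelength = 299_792_458 / 6.5e9
spacing = wavelength / 2
# A 90-degree (pi/2) phase difference maps to a 30-degree arrival angle.
print(round(angle_of_arrival_deg(math.pi / 2, spacing, wavelength), 1))
```

Half-wavelength spacing is a common choice because it keeps the phase difference unambiguous over the full ±90° field of view.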

The FiRa Consortium
The FiRa Consortium, founded in 2019, is an organization seeking to bring together industry and product experts and develop an ecosystem in which UWB products are interoperable and work seamlessly with each other. The membership has grown dramatically in the past year. The consortium will be launching a certification program that will ensure devices meet FiRa specifications for interoperability. The certification program will cover conformance test cases for both physical layer functions (PHY) and medium access control (MAC).

Certification means the device has been tested at an independent authorized testing lab (ATL) and found, after rigorous testing, to conform to FiRa specifications. That provides manufacturers and users a high degree of confidence that the devices will be interoperable with other devices.

LitePoint Solutions
LitePoint’s UWB testing system includes both hardware and software that enables pre-programmed testing scripts to maximize the testing of a particular chip. LitePoint’s IQgig-UWB is a fully-integrated, FiRa-validated system, designed for UWB PHY layer testing. Turnkey test solutions for UWB chipset and device manufacturers are available, including a FiRa PHY Conformance automation solution. IQgig-UWB is ideal for lab or high-volume manufacturing environments, for UWB-enabled modules and end-products and for UWB-enabled automotive, mobile, asset tracking or healthcare devices.

IQfact+, developed in close collaboration with leading UWB chipset vendors, is a software-based turnkey calibration solution for chipset manufacturers. It supports all of the key wireless connectivity technologies, including UWB.

UWB technology offers powerful capabilities for a wide range of settings where access control, location-based services and device-to-device communications are important. However, the actual performance achieved in the end application depends on ensuring that the key RF parameters meet a high degree of accuracy. These key measurement parameters seek to ensure the devices meet regulatory guidelines and interoperability requirements as well as deliver a positive user experience. From a test point of view, calibrating a UWB device’s frequency and transmitter power is crucial. Precise time-of-flight and angle-of-arrival measurements are needed to validate the centimeter-level accuracy that UWB devices can provide. LitePoint’s solutions provide that testing, helping to unlock UWB’s power.

LitePoint’s Eve Danel is the author of this blog. In this post, you’ll learn about the latest Bluetooth specifications and new test requirements for Bluetooth 5.1.

In 2010, Bluetooth Low Energy (Bluetooth LE) was introduced with the Bluetooth 4.0 standard. Compared to previous generations of Bluetooth, which were known as classic, Bluetooth LE operates at lower power, increasing the battery life of host devices. In 2016, the Bluetooth 5 standard was introduced, doubling the data rate and increasing the operation range. Bluetooth 5.1 was released in 2019, adding direction-finding enhancements that enable location tracking applications.

Bluetooth 5.1 Use Cases

With the direction-finding enhancements in Bluetooth 5.1, the technology is able to provide GPS-like positioning capabilities while operating indoors. The use cases for this type of technology range from asset tracking and locating people in buildings to indoor direction finding and object tracking. And this wide range of use cases can be applied in nearly every market vertical, including industrial, medical, retail, hospitality, enterprise, home and transportation.

RSSI-Based Proximity Sensing

Bluetooth is already a popular technology for proximity sensing using Bluetooth beacons. In this method, the locator estimates distance using the received signal strength indicator (RSSI), which represents the strength of the signal measured at the locator’s receiver. Knowing the transmitter’s power, it’s possible to estimate an item’s distance by measuring signal attenuation.

Figure 1: RSSI based method

RSSI only provides a crude estimate of distance because the method is biased by the environment. Obstacles between the transmitter and receiver add attenuation – a crowd of people, for example, can attenuate the signal significantly and therefore decrease the accuracy of the distance estimate. In addition, the RSSI method can only detect that the device is located somewhere in a circular zone around the receiver, as shown in Figure 1 above, because it doesn’t provide information about direction. By deploying multiple locators and using signal trilateration, it is possible to locate the device with more accuracy, but doing so increases the complexity of the system.
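A minimal sketch of the RSSI-to-distance estimate, assuming the common log-distance path-loss model with a free-space exponent of 2 (the "measured power at 1 m" reference is typical of beacon deployments, not a Bluetooth specification):

```python
def rssi_distance_m(rssi_dbm: float, measured_power_dbm: float,
                    path_loss_exponent: float = 2.0) -> float:
    """Crude distance estimate from RSSI using the log-distance
    path-loss model.

    measured_power_dbm: expected RSSI at 1 m, often advertised by
    beacons as "measured power". The exponent is ~2 in free space and
    higher in cluttered indoor environments.
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# Beacon advertising -59 dBm at 1 m, received at -79 dBm:
# 20 dB of extra loss maps to ~10 m in free space (n = 2).
print(round(rssi_distance_m(-79, -59), 1))
```

The environmental bias is visible in the exponent: the same 20 dB of loss that means 10 m in free space means only ~4.6 m if the clutter pushes the exponent to 3, which is exactly why RSSI alone is a crude estimator.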

Increasing Accuracy with Angle of Arrival and Angle of Departure

Bluetooth 5.1 increases the accuracy of the RSSI method by providing angle of arrival (AoA) and angle of departure (AoD) information.

AoA is designed for use in applications like asset tracking, where a moving AoA transmitter (a mobile phone, for example) sends Bluetooth LE direction-finding signals using a single antenna, and a fixed AoA receiver (for example, installed on the ceiling) is equipped with an antenna array with a minimum of two antennas. The array is used to determine the direction of the transmitter from the phase differences between the antennas. The angle-of-arrival determination is based on the signal’s wavelength, the distance between the antennas and the phase of the received signal.

Figure 2: AoA method

AoD is designed for use in applications like wayfinding for indoor navigation, where the AoD receiver (for example, a phone) receives the direction-finding signal transmitted by a fixed AoD transmitter (for example, installed on the ceiling) that is equipped with an antenna array with a minimum of two antennas. Just like AoA, the angle is determined using the signal’s phase information.

Figure 3: AoD method

Bluetooth 5.1 Constant Tone Extension (CTE)

To enable AoA and AoD, the signal’s phase must be measured. For direction finding, the Bluetooth 5.1 core specification adds a constant tone extension (CTE) field, a sequence of bits with a variable duration from 16 microseconds to 160 microseconds. CTE is only supported for Bluetooth LE operating at rates of 1 Mbps, which is mandatory, or, optionally, 2 Mbps. The CTE field contains a series of modulated 1 bits which must be transmitted at one frequency with a constant wavelength in order to measure the phase of the received signal. Thus, the signal is not subject to whitening, a process that scrambles signals to ensure there will be no long strings of 1s or 0s.

Figure 4: CTE

How is Bluetooth 5.1 Tested?

The key to a successful technology introduction for location services is how accurately the location can be measured; Bluetooth 5.1 aims to provide sub-one-meter accuracy. Validation of the solution plays an important role in ensuring the system’s performance.

The Bluetooth SIG has updated the Bluetooth PHY test specifications to include validation of these new direction-finding capabilities. New test cases were added for transmitter and receiver tests for the AoA and AoD methods, covering the supported Bluetooth data rates (1 Mbps and 2 Mbps) as well as the 1 microsecond and 2 microsecond switching and sampling times defined in the standard. Overall, 23 new test cases were added to the PHY tests to cover AoA and AoD.

CTE is a new concept for Bluetooth and the test cases are designed to ensure that it can be properly generated by the transmitter. On the receiver side, it’s important to ensure that the IQ measurements on the receiver CTE can be used to accurately derive the phase of the signal.

Figure 5: Complete AoA and AoD test setup

Figure 5 above shows a complete test setup for AoA and AoD transmitter and receiver testing. LitePoint’s IQxel-MW 7G supports the new direction-finding test cases. LitePoint also offers IQfact+ software packages that provide completely automated DUT and tester control for leading Bluetooth 5.1 chipsets.

AoA Transmitter and Receiver Testing

An AoA transmitter (typically a mobile device) has only a single antenna, and the CTE field is transmitted continuously, without switching, at the end of the PDUs. On the received signal, the tester verifies the maximum peak and average power of the CTE signal as well as the carrier frequency offset and carrier drift of these transmitted signals.

The AoA receiver is more complex because it has multiple antennas and a switch. When receiving the CTE signal transmitted by the tester, the receiver switches between the multiple antennas in the array according to a predetermined pattern. The switching time is either 1 microsecond or 2 microseconds, as defined in the standard. The DUT also samples the received CTE during the allocated sampling slots of 1 microsecond or 2 microseconds. The IQ samples taken by the DUT are transmitted to the tester for analysis to ensure that the phase measurements are within specs.
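The phase derivation from those IQ samples can be sketched as follows (a simplified illustration; a real tester also accounts for CFO-induced phase drift between sample slots):

```python
import math

def phases_from_iq(iq_samples):
    """Derive the signal phase (radians) from each (I, Q) sample pair,
    as done with the IQ samples captured during CTE sampling slots."""
    return [math.atan2(q, i) for i, q in iq_samples]

def phase_delta(phase_a: float, phase_b: float) -> float:
    """Phase difference between two antennas, wrapped to (-pi, pi]."""
    d = phase_b - phase_a
    return math.atan2(math.sin(d), math.cos(d))

# Two antennas sampling the same tone: the second lags by 45 degrees.
p = phases_from_iq([(1.0, 0.0),
                    (math.cos(math.pi / 4), math.sin(math.pi / 4))])
print(round(math.degrees(phase_delta(p[0], p[1])), 1))
```

This per-antenna phase difference is the quantity that feeds the angle calculation described in the AoA/AoD sections above.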

AoD Receiver and Transmitter Testing

An AoD receiver (mobile device with single antenna) needs to sample the received CTE signal at the correct sampling slots. To verify this, the IQ samples that are taken by the DUT are transmitted to the tester for processing and analysis. Just like with the AoA receiver test, this test verifies that the phase values derived from the sampling are within specs.

AoD transmitter devices have an antenna array with a minimum of two antennas, so the transmitter test verifies that the antenna switching occurs during the proper allotted slots of the CTE and follows the proper switching pattern within the antenna array.

Please visit the IQxel-MW 7G and IQfact+ pages for more information on LitePoint’s test solutions for Bluetooth 5.1.

For more information download our application note on Testing BT 5.1 or download the replay of our webinar on this topic.

LitePoint’s Eve Danel is the author of this two-part blog series. Throughout these posts, you’ll learn about the key RF-PHY changes made in Wi-Fi 6/802.11ax compared to Wi-Fi 5 and older generations: the addition of a new frequency band, higher modulation rates, smaller subcarrier spacing, longer guard intervals and the introduction of the OFDMA modulation technique providing capacity improvements and latency reduction.

In my previous blog post, I explored some of the key RF-PHY changes in Wi-Fi 6, including a fundamental change to the way that Wi-Fi 6 operates, which is using orthogonal frequency division multiple access (OFDMA). Let’s take a closer look at Wi-Fi 6 OFDMA.

Resource Units (RUs)

With OFDMA, the channel bandwidth is divided into resource units (RUs) of various sizes. The size of an RU can vary from the smallest 26 subcarriers (or 2 MHz) up to 996 tones (or 77.8 MHz). The sizes and locations of the RUs are defined for 20 MHz, 40 MHz, 80 MHz channels, and 80+80 or 160 MHz channels.

Figure 1: RU location in 80 MHz channel
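The RU sizes map to bandwidth through the 802.11ax subcarrier spacing of 78.125 kHz (a quarter of the 312.5 kHz spacing used by earlier generations). A quick sanity check of the figures above:

```python
# 802.11ax subcarrier spacing: 78.125 kHz
SUBCARRIER_SPACING_HZ = 78.125e3

def ru_bandwidth_mhz(tones: int) -> float:
    """Approximate occupied bandwidth of a resource unit."""
    return tones * SUBCARRIER_SPACING_HZ / 1e6

# The standard RU sizes, from the smallest to the largest single-RU case.
for tones in (26, 52, 106, 242, 484, 996):
    print(tones, round(ru_bandwidth_mhz(tones), 1))
```

The endpoints match the text: 26 tones occupy about 2 MHz and 996 tones about 77.8 MHz.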

RUs are comprised of:

  • Data subcarriers, which carry data information and form the majority of the subcarrier assignment
  • Pilot subcarriers, used for phase tracking and channel estimation
  • DC subcarriers at the center frequency of the channel
  • Guard band/null subcarriers, used at the band edges to protect against interference from neighboring RUs

Different numbers and sizes of RUs can be allocated for transmissions to different users, based on how much data each station needs. The access point (AP) is responsible for RU assignment and coordination. For example, applications that require a lot of data, like streaming video, can be assigned a large RU, while applications that require very little data can be assigned a small RU. Each RU can use a different modulation scheme and coding rate, and RU assignments can vary on a frame-by-frame basis.

Figure 2: OFDMA transmission showing 80 MHz channel divided into 8 RUs

For a downlink transmission, an HE MU PPDU carrying a mixture of 26-, 52-, 106-, 242-, 484-, and 996-tone RUs is transmitted from the AP to the stations. The HE MU PPDU preamble is transmitted over the whole channel and contains a new field called the HE-SIG-B that provides information to the stations about their RU size and frequency allocation, the modulation and coding scheme (MCS) and the number of spatial streams allocated by the AP.

Figure 3: HE MU PPDU

The HE-SIG-B field is only included in HE MU PPDU (downlink) and it includes the information for OFDMA and MU-MIMO resource allocation. The HE-SIG-B field includes a common field followed by user specific fields.

The common field of an HE-SIG-B content channel contains information regarding the RU assignment, the RUs allocated for MU-MIMO and the number of users in MU-MIMO allocations. The SIG-B user specific fields contain information for all users in the PPDU on how to decode their payload.

Wi-Fi 6 OFDMA creates a large number of permutations of possible RU sizes, frequency allocations and power levels that need to be validated to ensure the transmitter and receiver performance meets the standard criteria. All of these permutations can have different error vector magnitude (EVM) and spectral performance.

Wi-Fi 6 OFDMA uplink transmissions are even more complex than downlink operation, because in the uplink, the traffic must be transmitted simultaneously from multiple stations to the AP. In the uplink transmission, the AP acts as an operations and transmission coordinator.

First, the AP sends a trigger frame to all the stations that will be involved in the upcoming transmission, and then these stations transmit simultaneously on their respective RUs in response to the trigger frames. Based on the trigger frame, the client station will need to tune its timing, frequency and power levels to participate in this transmission.

Timing synchronization

Client stations participating in an OFDMA transmission must transmit within 400 ns of each other. In order to synchronize the clients, the AP transmits a trigger frame. This frame contains information about the OFDMA subcarrier RU assigned to each station. In response, the participating clients must start transmitting the uplink signal a short interframe space (SIFS) of 16 µs +/- 400 ns after the end of the trigger frame, as mandated by the 802.11ax standard.

Figure 4: Timing synchronization of uplink OFDMA
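The timing requirement can be expressed as a simple compliance check (a sketch of what a tester verifies, not tester code):

```python
SIFS_US = 16.0          # short interframe space after the trigger frame
TOLERANCE_US = 0.4      # +/- 400 ns allowed by 802.11ax

def uplink_start_ok(trigger_end_us: float, tx_start_us: float) -> bool:
    """Check that a station starts its trigger-based uplink PPDU
    16 us +/- 400 ns after the end of the trigger frame."""
    return abs((tx_start_us - trigger_end_us) - SIFS_US) <= TOLERANCE_US

print(uplink_start_ok(0.0, 16.3))   # 300 ns late, within tolerance
print(uplink_start_ok(0.0, 16.5))   # 500 ns late, out of tolerance
```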

Frequency synchronization

To prevent inter-carrier interference (ICI) between the clients transmitting simultaneously, all stations participating in the transmission need to pre-compensate for carrier frequency offset (CFO). The client stations adjust their carrier frequency based on the trigger frame received from the AP. The 802.11ax standard requires the residual CFO error after compensation to be less than 350 Hz.

Figure 5: Wi-Fi OFDMA frequency synchronization
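A sketch of the residual-CFO compliance check (illustrative only; the tester derives the offsets from the received waveform rather than from nominal carrier values):

```python
MAX_RESIDUAL_CFO_HZ = 350.0  # 802.11ax limit after pre-compensation

def residual_cfo_ok(station_carrier_hz: float, ap_carrier_hz: float) -> bool:
    """Check the residual carrier frequency offset of a station after
    it pre-compensates its carrier based on the AP's trigger frame."""
    return abs(station_carrier_hz - ap_carrier_hz) <= MAX_RESIDUAL_CFO_HZ

# On a 5.18 GHz channel: 300 Hz of residual offset passes, 400 Hz fails.
print(residual_cfo_ok(5.18e9 + 300, 5.18e9))
print(residual_cfo_ok(5.18e9 + 400, 5.18e9))
```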

Power control

When traffic is transmitted between the AP and client stations located at various distances, power control is needed to ensure that stations closer to the AP do not drown out users farther from the AP that are transmitting simultaneously. The 802.11ax standard requires the stations to adjust their power based on the estimated path loss to the AP. Devices closer to the AP transmit less power while devices farther away transmit more, so that the power arriving at the AP receiver is the same after path loss. There are two classes of devices defined in the standard based on how accurately they can control their power: Class A devices control their transmit power within ±3 dB and Class B devices within ±9 dB.

Figure 6: Wi-Fi OFDMA power control
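The open-loop power control logic can be sketched as follows (the target receive power, path-loss values and 20 dBm cap are illustrative assumptions, not values from the standard):

```python
def uplink_tx_power_dbm(target_rssi_at_ap_dbm: float,
                        path_loss_db: float,
                        max_power_dbm: float = 20.0) -> float:
    """Open-loop uplink power control sketch: each station adds its
    estimated path loss to the AP's target receive power, capped at
    the device's maximum transmit power."""
    return min(target_rssi_at_ap_dbm + path_loss_db, max_power_dbm)

# AP wants -60 dBm at its receiver: a near station (50 dB path loss)
# transmits 10 dB less than a far one (60 dB path loss).
print(uplink_tx_power_dbm(-60, 50))
print(uplink_tx_power_dbm(-60, 60))
```

Both stations then arrive at the AP at the same -60 dBm, which is the whole point of the scheme.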

The accuracy of the synchronization between the AP and the client is critical because a single bad actor (station) that doesn’t perform as expected will degrade performance for all other users that are sharing the same transmission.

Unused Tone EVM

Another concept introduced in Wi-Fi 6 OFDMA is the unused tone EVM metric. As discussed above, when the stations transmit in the uplink direction on their assigned RU, it is important that emissions do not spill over into other RUs, otherwise that will reduce the system capacity for other users.

Figure 7: Unused tone EVM

To evaluate the station’s performance, the 802.11ax standard introduced this new metric, which is a measure of the in-channel emissions generated by the stations. The IEEE specifies RU leakage as a staircase mask requirement, called unused tone EVM. It measures the EVM floor of each 26-tone RU across the whole channel bandwidth, excluding the position(s) of the active RU.
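A simplified sketch of the metric (illustrative only; a real tester works on demodulated constellation error per subcarrier and applies the staircase limits, and the numbers below are made up):

```python
import math

def unused_tone_evm_db(tone_error_power, active_ru, ru_size=26):
    """Average the error power of each 26-tone block across the
    channel, skipping the active RU, and report each block in dB
    relative to a unit reference power (a simplified stand-in for
    the unused tone EVM measurement)."""
    results = {}
    for ru in range(len(tone_error_power) // ru_size):
        if ru == active_ru:
            continue  # the active RU is judged by regular EVM instead
        block = tone_error_power[ru * ru_size:(ru + 1) * ru_size]
        results[ru] = round(10 * math.log10(sum(block) / len(block)), 1)
    return results

# Four 26-tone blocks; the station transmits on block 0 and leaks
# -40 dB (1e-4 linear) of error power into every other tone.
leakage = [1.0] * 26 + [1e-4] * 78
print(unused_tone_evm_db(leakage, active_ru=0))
```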

The validation of uplink multi-user transmission in Wi-Fi 6 OFDMA is one of the most challenging areas of 802.11ax testing. It requires the tester’s signal generation and signal analysis functions to be coordinated in a test sequence that provides nanosecond-level measurement accuracy, in order to verify the required 400-nanosecond timing of the uplink transmission.

Figure 8: Wi-Fi 6 trigger-based testing

When testing an AP design, the tester emulates a station, and it decodes the received trigger frame information in order to generate a trigger-based physical layer protocol data unit (PPDU) with the correct RU allocation. The tester also needs to adjust its power to the AP requirements in real time because it needs to respond to the AP with a nanosecond level of accuracy.

When testing a client station, the tester emulates an AP, generates a trigger frame and will measure the response with the signal analysis function. The tester measures the trigger-based PPDU generated by the station itself. The tester needs to measure the timing, frequency and power accuracy of this transmission from the client to ensure it meets the 802.11ax standard requirements and the RU leakage requirements.


Beyond testing the traditional transmit and receiver metrics that are necessary for any Wi-Fi generation, the chart below shows an overview of the new test areas that are specific to 802.11ax. These test sequences for Wi-Fi 6 OFDMA require a higher level of test complexity compared to previous Wi-Fi generations.

Figure 9: Wi-Fi 6 key test areas

If the number of test cases and complexity of Wi-Fi 6 OFDMA seems overwhelming, I encourage you to explore LitePoint’s IQfact+ automation solution. The software is tailored to control both the tester and the device under test (DUT) and it provides calibration and verification automation. LitePoint already offers IQfact+ packages for Wi-Fi 6 and Wi-Fi 6E chips for AP and client devices.