Breaking News

Wi-Fi news from LitePoint

As one of this century’s seminal communications technologies, Wi-Fi has steadily improved its performance over the past 25 years primarily by focusing on one key metric: speed. Expanded frequency bands, wider channels, higher-order modulation, more spatial streams – these advances were all designed to push peak throughput ever higher. Wi-Fi 8 is a decisive break from that pattern.

This was the conclusion shared by wireless technology experts during a recent RCR Wireless News Wi-Fi Forum I joined to explore the future of the evolving communications protocol. The panel – “Wi-Fi 8 on the horizon: Will it be the connectivity fabric for the AI era?” – was moderated by Rosalind Craven, Principal Analyst with telecoms consultancy STL Partners, and featured Andy Davidson, Senior Director of Technology and Planning at Qualcomm; Spirent’s Principal Product Manager, Janne Linkola; Marcus Brunner, ISG F5G Vice-Chair at the European Telecommunications Standards Institute (ETSI); and myself, Product Marketing Manager at LitePoint.

As the panelists emphasized, Wi-Fi 8 is not about chasing data rates; its goal is to ensure wireless connections are predictable and resilient, delivering a consistently positive user experience even in the most challenging environments.

From “How Fast?” to “How Reliable?”

The motivation driving Wi-Fi 8 reflects how wireless networks are actually used. For home, enterprise and industrial users, peak speeds matter less than how connections hold up in the real world, where dozens – or even thousands – of devices compete for airtime and throughput and latency deserve equal weight.


This includes more robust handling of interference, improved coordination across access points and better consistency at the edge of the network. These performance imperatives address the frustrations users encounter every day: dropped calls, frozen video and unpredictable roaming behavior.

The panel reached a consensus that Wi-Fi 8’s value is greatest in high-stress use cases. These include enterprise and campus networks, as well as densely populated public venues like airports, stadiums and convention centers, where reliable roaming, congestion management and improved coexistence are essential for consistent performance. Smart home devices, immersive XR, spatial computing and cloud gaming platforms are another target category where low latency defines the user experience. Industrial automation applications, meanwhile, demand deterministic performance and reliability to manage communications links for robotics, monitoring and safety-critical systems.

Standardization Timing and Top Challenges

As for timing, the panel aligned on a familiar pattern. Wi-Fi 8 is still in development within IEEE 802.11bn, with formal ratification expected later in the decade. Early implementations based on draft versions of the standard are likely to appear before then, followed by broader commercial adoption as certification programs mature.

This staged rollout mirrors previous Wi-Fi generations, but with one important difference: expectations are being managed more carefully. Rather than promising dramatic speed jumps, the industry is positioning Wi-Fi 8 as a foundation for long-term performance stability, especially in dense, high-reliability deployments.

Ultimately, timing will be influenced by a series of recurring challenges that surfaced throughout the discussion:

Reliability and Interference: As wireless environments grow more crowded, interference is no longer an edge case – it’s the norm. Wi-Fi 8 aims to improve robustness in the presence of both Wi-Fi and non-Wi-Fi interferers, reducing performance cliffs and improving fairness across devices.

Congestion and Density: From apartment buildings to stadiums to campus-wide deployments, usage density stresses today’s networks. Wi-Fi 8 introduces mechanisms to better coordinate transmissions across access points and clients, improving aggregate performance and reducing contention.

Latency and Jitter: For applications like real-time collaboration, immersive media and industrial control, latency consistency often matters more than peak throughput. The panel highlighted key enablers for time-sensitive applications, including minimizing latency spikes and improving determinism.

Interoperability: With more complex features comes a greater risk of inconsistent implementations. Ensuring that devices from different vendors behave predictably and work together is a major focus of the standard.

Why Test Still Matters – More Than Ever

The panel reiterated that Wi-Fi 8’s improved reliability places a burden on validation in the form of wireless signal and radio testing. This remains essential to ensure that devices behave correctly across a wide range of real-world conditions, including interference, mobility, congestion and coexistence scenarios. Interoperability testing becomes especially critical as features grow more complex and user expectations rise.


At LitePoint, we believe Wi-Fi 8 is reinforcing the importance of comprehensive test strategies that extend beyond basic throughput measurements. Validating reliability, latency behavior and performance consistency across diverse scenarios is what ultimately ensures that the promise of Wi-Fi 8 pays off with a consistently excellent user experience.

More Than “Just Another Wi-Fi”

The panel’s conclusion was clear: Wi-Fi 8 is not defined by a single breakthrough feature. Its significance lies in a philosophical shift from maximizing peak performance to delivering dependable, high-quality connectivity where it matters most.

In that sense, Wi-Fi 8 represents a maturation of the technology. It acknowledges that wireless is no longer a convenience layer, but critical infrastructure. By prioritizing reliability, interoperability and experience, Wi-Fi 8 moves the industry closer to a future where wireless performance isn’t just something users can measure – it delivers an experience they can trust.


Bluetooth® and ultra-wideband (UWB) are often discussed in isolation rather than as parts of a coordinated system. That framing no longer reflects how real systems are being designed or deployed. In practice, Bluetooth Low Energy (LE) and UWB are increasingly architected to work together, each addressing different technical objectives. Automotive digital keys are the clearest example today, but the same complementary pattern is emerging across other proximity-based applications.

At the same time, both technologies are evolving rapidly. Bluetooth is adding more sophisticated ranging capabilities, while UWB continues to advance accuracy, security and performance features. For manufacturers, suppliers and designers in the automotive and other markets, this convergence leads to a conclusion: wireless test methodologies must evolve from assessing individual radios to validating complete, multi-radio systems.

Why Bluetooth LE and UWB Coexist in Digital Keys

Modern digital key architectures grew out of early Bluetooth LE-only implementations that relied heavily on a Received Signal Strength Indicator (RSSI). RSSI is a relative measure of received signal strength that is used to infer proximity to the signal source. While effective for discovery and convenience, RSSI-based distance estimation is inherently vulnerable to relay and man-in-the-middle attacks, making it unsuitable as the sole mechanism for security-critical access decisions.
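To see why RSSI makes a fragile distance proxy, consider the log-distance path-loss model it implicitly relies on. The sketch below is an illustration, not any vendor’s implementation; the 1-meter calibration constant and path-loss exponent are assumed values:

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m_dbm=-59, n=2.0):
    """Invert the log-distance path-loss model.

    rssi_at_1m_dbm: a per-device calibration constant (assumed here);
    n: path-loss exponent (~2 in free space, 3-4 in cluttered indoor spaces).
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * n))

# The same -75 dBm reading maps to very different distances depending on
# a path-loss exponent the receiver cannot directly observe.
free_space_m = rssi_to_distance(-75, n=2.0)   # roughly 6.3 m
indoors_m = rssi_to_distance(-75, n=3.5)      # roughly 2.9 m
```

A relay attacker – or simply a phone in a pocket – shifts the reading by several dB, moving the estimate by meters, which is why RSSI alone cannot safely gate an unlock decision.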

This limitation drove the industry toward UWB-based secure ranging, formalized through initiatives such as the Car Connectivity Consortium (CCC). Today’s digital key designs intentionally divide responsibilities:

  • Bluetooth LE = Discovery and Intent. Vehicles identify authorized phones or key fobs using Bluetooth LE, then monitor approach behavior and user intent over a relatively long range with very low power consumption.
  • UWB = Secure Localization. Once the user enters a defined proximity zone, UWB performs precise, secure distance measurements before authorizing unlock.

This handoff is foundational. Bluetooth LE acts as the always-on, low-power sentinel. UWB acts as the high-confidence decision engine.

When Accuracy and Security Matter, UWB Is Essential

UWB’s advantage lies in how it measures distance. Instead of inferring proximity from signal strength, UWB uses time-of-flight techniques that deliver centimeter-level accuracy under typical operating conditions. Security is reinforced at the physical layer through mechanisms such as Scrambled Timestamp Sequence (STS), which significantly raises the bar against relay and replay attacks.
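The arithmetic behind that advantage is simple. Here is a hedged sketch of single-sided two-way ranging, the simplest of several time-of-flight schemes (function and variable names are illustrative):

```python
C = 299_792_458  # speed of light, m/s

def tof_distance_m(round_trip_s, reply_delay_s):
    """Single-sided two-way ranging: half the round-trip time, minus the
    responder's known turnaround delay, times the speed of light."""
    return (round_trip_s - reply_delay_s) / 2 * C

# Each nanosecond of timing error moves the estimate by roughly 15 cm,
# which is why UWB's very wide bandwidth -- and the sharp pulse edges it
# permits -- is what makes centimeter-level accuracy feasible.
error_per_ns_m = tof_distance_m(1e-9, 0)  # ~0.15 m
```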

Equally important, Bluetooth LE and UWB do not compete for spectrum. Bluetooth LE operates in the 2.4 GHz band, while UWB occupies much higher frequencies, typically in the 6- to 10-GHz range depending on region. This allows both radios to operate concurrently without RF interference.

Bluetooth Is Evolving and Strengthening Coexistence

Bluetooth LE itself is not standing still. Technologies such as Bluetooth Channel Sounding are improving ranging accuracy by combining phase-based ranging and round-trip time measurements.

These advances do not replace UWB; they make coexistence more efficient. Improved Bluetooth LE ranging enables better intent detection and helps systems decide when to wake UWB, reducing unnecessary power consumption while maintaining responsiveness. In effect, improvements in Bluetooth enhance the effectiveness of UWB-based systems.
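For intuition, the phase-based half of Channel Sounding reduces to a single relationship: a two-way signal accumulates phase in proportion to distance and tone spacing. The sketch below is a simplified single-tone-pair illustration, not the Bluetooth specification’s procedure – real implementations sweep many channels and fuse the result with round-trip-time measurements:

```python
import math

C = 299_792_458  # speed of light, m/s

def pbr_distance_m(delta_phase_rad, delta_freq_hz):
    """Two-way phase-based ranging: the phase difference measured across
    two tones delta_freq_hz apart encodes the round-trip distance."""
    return C * delta_phase_rad / (4 * math.pi * delta_freq_hz)

# With tones 1 MHz apart, a half-cycle phase difference corresponds to
# roughly 75 m. Distances beyond c / (2 * delta_f) alias, which is one
# reason implementations combine many tone pairs.
d_m = pbr_distance_m(math.pi, 1e6)
```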

Automotive As a Proof Point

Automotive remains the most visible example of this complementary design approach because the consequences of failure are so high. Unlocking must be seamless, secure and repeatable across devices, environments and regions.

(Figure legend: orange = UWB, blue = Bluetooth.)

At the same time, UWB adoption is expanding beyond access control. Applications such as indoor navigation, asset tracking and tap-free interactions are gaining traction. In vehicles, UWB-based sensing use cases, including child presence detection, intrusion alerts, and gesture or kick sensing, are already appearing, often through proprietary implementations. Over time, many of these sensing applications are expected to converge toward standardized approaches, increasing the importance of adaptable test strategies.

The Real World Is Not a Clean RF Lab

Even well-designed systems can fail if they’re only validated under ideal conditions. In practice, RF performance is affected by multipath, orientation and human factors. A phone in a back pocket, for example, can experience significant signal absorption by the human body, altering RSSI readings and delaying discovery or intent detection.

This variability reinforces a critical lesson: passing standards-based RF tests is not the same as delivering a reliable user experience. Over-the-air (OTA) test and system-level validation are essential, particularly for proximity-based applications where timing and behavior matter as much as raw RF performance.

What Changes for Test When Bluetooth LE and UWB Form a System

When Bluetooth LE and UWB are combined, test requirements expand beyond single-radio validation. During design and validation, teams must verify Bluetooth LE discovery and intent detection under realistic attenuation and orientation, test UWB ranging accuracy, timing determinism and secure ranging behavior, and confirm the correct Bluetooth LE-to-UWB handoff across defined proximity zones.

In manufacturing, production test must adapt accordingly. Bluetooth LE still requires RF parametric checks such as transmit power verification and RSSI calibration. UWB introduces additional requirements, including power calibration to meet regulatory limits and antenna delay compensation to enable accurate time-of-flight measurements. Timing synchronization and intent verification increasingly become part of end-of-line testing.
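Antenna delay compensation, in particular, is easy to picture: the radio timestamps a packet somewhere inside the chip, so the measured time-of-flight includes a fixed front-end delay that must be calibrated out against a fixture at a known distance. A minimal sketch, using a simplified one-way model with illustrative names rather than any production calibration routine:

```python
C = 299_792_458  # speed of light, m/s

def calibrate_antenna_delay(measured_tof_s, fixture_distance_m):
    """On a fixture at a known distance, whatever time remains after the
    true flight time is the device's fixed front-end delay."""
    return measured_tof_s - fixture_distance_m / C

def corrected_distance_m(measured_tof_s, antenna_delay_s):
    """Subtract the stored per-device delay before converting to meters."""
    return (measured_tof_s - antenna_delay_s) * C

# End-of-line flow: calibrate once against a 3 m fixture, store the
# delay, then apply it to every subsequent measurement.
delay_s = calibrate_antenna_delay(1.0e-6, 3.0)
```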

Beyond the factory, field testing and regression validation – often linked to OTA software updates – play a growing role. CCC certification processes already emphasize real-world scenario testing, where OEMs and device makers jointly validate lock and unlock behavior. These results feed regression testing to ensure future updates do not introduce new failure modes.

Implications for Wireless Test Platforms

The message across all stages is clear: test must evolve with system complexity. As Bluetooth LE and UWB continue to advance through improved ranging, higher data rates and extended range, test platforms must adapt without forcing redesigns at every technology transition.

For LitePoint, this means enabling customers to validate Bluetooth LE and UWB together, over the air, with the timing, phase, power and security measurements these systems demand. The goal is not simply to test radios, but to verify complete proximity systems – from discovery to secure action. UWB and Bluetooth are not rivals. They are complementary technologies whose combined strengths enable secure, intuitive user experiences. Automotive shows how this coexistence works today – and why modern wireless test must evolve alongside it.

As we plunge into the new year, LitePoint scouted three wireless trends in various stages of maturity: the rise of AI wearables, the awaited arrival of Wi-Fi 8 silicon, and early spadework required to seed 6G cellular networks. In their own way, each is enabling the future of AI communications by moving actionable intelligence closer to the user, shoring up wireless data security and privacy, and expanding global access to a new swath of frequency bands.

Here’s how we see these forces shaping up over the next 12 months, and why, for wireless design and test engineers, the coming year is more about preparation than promotion.

AI Wearables Trend from Geek to Chic

AI wearables are a relatively small but high-growth segment. Grand View Research found that the on-device AI category – where tasks are run locally without heavy cloud support – accounted for about 10 percent of the $84 billion wearables market in 2024, the last period for which full-year data is available. Look for acceleration this year as on-device AI rides a 27.8 percent CAGR wave through 2030 – twice that of the total wearables segment.

The growing interest in AI wearables isn’t due to any breakthrough in bleeding-edge wireless technology. Rather than demanding more gigabits per second, consumers want lower latency, hands-free content capture, intelligent personal assistance, detailed health insights and enhanced security.

These features define an expanding universe of AI wearables that includes smart watches, rings, fitness trackers and eyewear. In a sign that form is now driving function, AI devices not only need to operate for days on small batteries, they also have to fit comfortably on the face or body and be stylish enough to wear in public.

Where AI performance meets sophisticated design, expect wireless engineers to open new frontiers for the efficient integration of radios, sensors, processors and memory in millimeter-sized packages. For test, AI wearables create challenges both familiar and new:

  • Highly integrated multi-radio design (Bluetooth, Wi-Fi, UWB, cellular) in tight spaces
  • Aggressive power budgets that demand careful RF optimization
  • On-body propagation effects that don’t exist in other electronic designs
  • New durability concerns: AI-enabled garments must weather everyday use, including washing cycles, without RF signal degradation.

Ensuring reliability requires test methodologies that replicate true on-body conditions, validate coexistence under load and measure the performance impact of AI workloads on battery life and RF behavior. The rise of AI wearables isn’t just a new device trend; it’s a new paradigm for wireless validation.

Wi-Fi 8: The Quiet Enabler of Next-Gen AI

While AI wearables are a trend consumers will touch in 2026, the Wi-Fi 8 transition will unfold largely out of sight. Throughout 2025, Wi-Fi 8 existed mostly in chipset labs. In 2026, those early silicon samples will move into the hands of access point vendors, IoT designers and smartphone makers. Commercial products won’t reach shelves this year in any meaningful volume, but the design and development phase will be in full swing.

Wi-Fi 8 isn’t designed to be a dramatic technological leap. Instead, it focuses on the foundational requirements of rich AI experiences:

Reliable Access to Data: Whether you’re wearing smart glasses in an airport or managing industrial robots, AI experiences break down quickly when data transfers become unpredictable. Wi-Fi 8 targets greater reliability under congestion and interference – conditions increasingly common in RF-dense environments.

Lower Latency: Many emerging AI applications are latency-sensitive, from real-time translation to predictive navigation and contextual assistance. Even modest reductions in round-trip delays can make AI feel more immediate, personal and trustworthy.

Security and Privacy: AI depends on deeply personal data: biometrics, voice patterns, behavior trails. Some processing can stay on-device, but cloud inference remains essential, meaning data must move securely between the device and network.

(Robustness equates to signal integrity and resilience, while reliability indicates the probability of successful transmission.)

Together, these pillars make Wi-Fi 8 an AI-enabling technology, even if the connection isn’t always obvious. For test teams, this translates into new performance criteria: measuring latency under load, validating advanced features, assessing coexistence in mixed-radio environments and stress-testing devices in conditions that mimic real-world Wi-Fi.

Companies like LitePoint that have worked closely with early Wi-Fi 8 chipset developers will be best positioned to help device makers navigate the transition. Much of 2026 will be spent translating new silicon capabilities into robust, production-ready products capable of supporting next-generation AI workloads.

6G in 2026: Pre-Spec, But Busy

If Wi-Fi 8 is closing in on its commercial debut, 6G is still playing the pre-spec long game, with most roadmaps pointing to 2030 for initial market deployments. But that doesn’t mean the industry is idle. Behind closed doors it’s grappling with two big challenges.

The first is reconciling the unrealized promise of 5G cellular. 5G was marketed as a transformative platform enabled by heterogeneous networks, network slicing and ultra-reliable low-latency communications. What materialized, mostly, was faster 4G. The majority of global networks remain non-standalone, anchored in 4G cores, while private networks exist primarily as niche deployments. And carriers have not yet realized significant new revenue streams from 5G. These lessons loom large for 6G.

The second reality stems from an emerging hypothesis that 6G will be driven less by speed and more by distributed intelligence. This includes AI embedded throughout the network, massive sensor fabrics and new architectural models such as AI-RAN. The industry is still in exploration mode, evaluating which of these use cases genuinely require new spectrum, new waveforms or new topologies.

6G Spectrum: The FR3 Frontier

Much of the early 6G buzz in 2026 centers on FR3 spectrum (~7-24 GHz), sometimes called the “middle child” between today’s sub-7 GHz cellular bands and millimeter-wave FR2. Below ~16 GHz, FR3 behaves like traditional sub-6 GHz and can use existing packaging and antenna techniques. Above that, it becomes more like FR2, requiring complex antenna arrays and more specialized integration.

But unlike FR2 during 5G development, FR3 is not a clean slate. It’s a patchwork of incumbents that includes satellite systems, weather radar and point-to-point radio links. The 2023 World Radiocommunication Conference deferred the major decisions on how this spectrum can be harmonized globally, leaving them for WRC-27 to resolve.

Source: https://arxiv.org/abs/2310.06425

In the absence of ratified standards, engineers need flexible test platforms capable of exploring candidate bands, characterizing new modulation techniques, pushing higher-order MIMO concepts and validating performance in contested, fragmented spectrum. The ability to measure what’s possible – before regulators and standards bodies decide what’s permissible – may be what keeps 6G momentum chugging along.

Portending a Year of Preparation

Across AI wearables, Wi-Fi 8 and early 6G work, a common narrative emerges: 2026 is not the year everything changes, but the year we prepare for everything that will.

AI wearables are redefining human-device interaction for the first time since the smartphone. Wi-Fi 8 development will give AI devices the reliability, latency and security they require. And 6G experimentation will lay the engineering and spectrum groundwork for the next era of network intelligence.

For wireless engineers and test teams, 2026 is a pivotal year. It’s when we refine the methods needed to validate devices worn on the body, characterize Wi-Fi 8 in chaotic real-world conditions and explore 6G candidate bands long before they become standardized. The work done now will determine how fast the industry can translate today’s AI ambitions into tomorrow’s reality.

The past year of wireless innovation may have lacked a major headline, but it did provide a steady stream of advances that marked a tipping point for a handful of technologies and end applications. The launch of the iPhone 16 provided Wi-Fi 7 with a watershed moment as it flipped from early adoption to mass deployment. Bluetooth® Low Energy (LE) remained the fastest-growing Bluetooth segment thanks to robust take-up rates in wearables, smart homes, IoT infrastructure and growing interest in features like channel sounding and high data throughput (HDT). Ultra-wideband (UWB) and 5G gained ground in automotive as keyless entry and telematics enablers, while the healthcare industry continued laying the groundwork for AI-enabled wireless medical devices.

Wi-Fi 7 Moves From Early Adoption to Volume Production

By the start of 2025, Wi-Fi 7 – still fresh from IEEE standard ratification – debuted as the new short-range communications pathway for the iPhone 16. With multiple frequency bands operating simultaneously, Wi-Fi 7 was designed to drive faster speeds, lower latency and more reliable connectivity. The new standard quickly assumed its place as the flagship wireless technology for a host of smartphone platforms, laptops, home routers and portable hotspots as large venues such as airports began rolling out Wi-Fi 7 networks.

Technically, Wi-Fi 7’s big selling points are wide 320-MHz channels and 4096-QAM modulation, which deliver multi-gigabit peak throughput. Those same features made life interesting for manufacturers and test engineers, stretching the performance limits of some RF front-ends and test equipment as Wi-Fi 7 demanded high-volume test strategies.

From a deployment perspective, Wi-Fi 6 had already introduced OFDMA and other efficiency improvements, but in doing so made coexistence with legacy Wi-Fi challenging. Networks often had to choose between “modern” and “legacy” operating modes, losing some functionality when older devices were present. Wi-Fi 7 addressed that problem with preamble puncturing, which allows a router to carve out and “puncture” a portion of a channel that is experiencing interference, instead of rendering the entire channel unusable.
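Conceptually, puncturing treats a wide channel as a set of 20-MHz slices and masks out only the interfered ones. The toy model below illustrates just the bandwidth arithmetic; actual 802.11be puncturing patterns are constrained by the standard:

```python
def puncture(busy_slices, total_slices=16, slice_mhz=20):
    """Treat a 320 MHz channel as 16 x 20 MHz slices and mask out the
    interfered ones instead of abandoning the whole channel."""
    usable = [i for i in range(total_slices) if i not in busy_slices]
    return usable, len(usable) * slice_mhz

# An interferer occupying slices 4 and 5 costs 40 MHz of the channel,
# not all 320 MHz.
usable_slices, bandwidth_mhz = puncture({4, 5})
```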

In practical terms, 2025 was the year Wi-Fi 7 became both faster and easier to deploy than its predecessor, in large part by reducing interference with installed Wi-Fi 5/6 gear. For end users, that meant more consistent throughput in crowded homes and offices and lower latency for gaming and video conferencing.

Bluetooth Sharpened Its Focus

Bluetooth’s story over the past 12 months focused heavily on two areas: precise location detection using channel sounding and the long-gestating HDT mode.

Channel sounding – a star feature of the Bluetooth 6 roadmap – improves how well Bluetooth devices can locate each other. Instead of relying only on received signal strength, channel sounding uses phase and frequency information from multiple antennas to estimate distance and direction with far better accuracy. That’s crucial for asset tracking, indoor navigation, secure access and proximity-based applications. Test and measurement efforts in 2025 helped to verify these new ranging behaviors in real-world multipath environments.

In parallel, the industry made progress standardizing HDT. For several years, major silicon vendors had shipped proprietary “boosted” Bluetooth modes, each with its own branding and slightly different implementations, to deliver higher data rates for features like lossless audio. In 2025, the Bluetooth SIG’s work on HDT began harmonizing those individual efforts into a common specification capable of about 8 Mbps, enabling high-resolution and spatial audio with better interoperability.
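A quick back-of-the-envelope calculation (mine, not the SIG’s) shows why roughly 8 Mbps is the interesting threshold for audio: uncompressed stereo PCM bitrates sit comfortably below it.

```python
def pcm_bitrate_mbps(sample_rate_hz, bits_per_sample, channels=2):
    """Bitrate of uncompressed PCM audio in Mbps."""
    return sample_rate_hz * bits_per_sample * channels / 1e6

cd_quality = pcm_bitrate_mbps(44_100, 16)  # ~1.4 Mbps (16-bit/44.1 kHz)
hi_res = pcm_bitrate_mbps(96_000, 24)      # ~4.6 Mbps (24-bit/96 kHz)
# Both fit inside HDT's ~8 Mbps with headroom for protocol overhead,
# which is what makes lossless and spatial audio plausible over Bluetooth.
```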

(Rate table: * denotes an optional rate. Source: Bluetooth SIG.)

The challenges here were twofold. First, vendors had to enable HDT without blowing up power budgets in earbuds, wearables and battery-powered sensors. Second, interoperability testing became much more demanding when validating location accuracy and multi-megabit streaming.

Automotive High Notes Featured Digital Keys and Telematics

Automotive was a mixed economic story in 2025, but the wireless content in cars continued to climb. Two trends stood out: UWB-based digital keys and 5G-enabled telematics.

On the access side, UWB moved closer to becoming the de-facto technology for secure keyless entry. The Car Connectivity Consortium (CCC) expanded its Digital Key Certification Program to cover Bluetooth LE and UWB in addition to near-field communication (NFC), ensuring that phones and cars from different vendors can interoperate securely. Meanwhile, organizations such as FiRa Consortium and CCC coordinated on UWB ranging and security testing, bridging what were historically separate consumer-electronics and automotive worlds.

The challenge for in-vehicle wireless connectivity is that UWB has to coexist with Wi-Fi, Bluetooth and cellular systems in the same space, and in some bands nearby Wi-Fi deployments can trample UWB channels. This drove new work on UWB channel harmonization, coexistence strategies and more rigorous validation of ranging accuracy and resilience in noisy environments.

In parallel, car telematics and connectivity continued moving toward 5G adoption. Market studies projected automotive-grade 5G module revenue in the $3.75 billion range for 2025, with nearly 21 percent CAGR through the next decade as modules feed telematics control units and network access devices. These units carry emergency-call services, over-the-air software updates, fleet telemetry and, increasingly, data for ADAS and autonomous driving features.

Here the engineering challenges revolve around reliability and lifecycle: radios must survive harsh thermal and mechanical environments for 10-15 years, maintain secure connectivity over multiple cellular generations and coexist with the car’s internal networks. Module-based designs and more integrated chip-on-board solutions both saw interest in 2025, with OEMs balancing flexibility, certification complexity and cost.

Wireless Medical Devices: Laying the Foundation for AI

Wireless medical devices weathered a year marked by cautious optimism, with Mordor Intelligence estimating the market generated $31.43 billion in revenue.

Growth is expected to include wireless infusion pumps and ECG patches, Bluetooth-enabled blood pressure cuffs and glucose monitors and data-connected bedside monitors, but the challenges are substantial. Devices must be tested to meet strict safety and cybersecurity requirements, operate reliably in noisy, RF-dense hospital environments and run for days or weeks on cell-sized batteries.

In retrospect, 2025 may have been a foundational year in which healthcare connectivity and data pipelines were hardened to support AI-enhanced wearables and similarly ambitious deployments in the years to come.

The Rearview Mirror Reveals What Lies Ahead

For many wireless technologies, 2025 was more a year of maturation than movement. Wi-Fi 7 became deployable at scale; Bluetooth updated a roadmap to higher throughput; cars added more radios for keys and telematics; and wireless medical devices edged closer to AI implementation.

The threads that run through each area include coexistence with legacy systems, the move from pilot projects to mass deployment and the need to turn raw bandwidth into dependable, often safety-critical services. These solutions are setting the stage for what comes next: Wi-Fi 8, 6G, AI-native radio networks and data-centric healthcare.

Ultra-wideband (UWB) is a short-range wireless communication technology that uses very wide frequency channels – typically 500 MHz wide – to enable secure, low-power communication. Its standout feature is fine ranging: the ability to precisely measure the distance between two devices by calculating signal travel time.

In automotive applications, UWB has become an essential technology, especially for enabling secure digital keys and passive entry systems. As UWB adoption in vehicles increases, challenges such as spectrum congestion and interference are becoming more pronounced.


UWB operates at very low power under the FCC’s Part 15 rules, overlaying spectrum used by other wireless technologies, including Wi-Fi in the 6-GHz band. This makes it susceptible to signal interference as UWB adoption grows across automotive and Wi-Fi-dependent consumer devices.

The potential for signal congestion is driving a move to harmonize UWB channels to accommodate more devices, reduce interference and future-proof new deployments. Unusually, the effort is being led by carmakers and Tier 1 suppliers, with the typically slower-moving automotive sector setting the pace for phone vendors and other consumer electronics manufacturers.

Channel Expansion Frequencies

Although the IEEE 802.15.4 standard defines 16 UWB channels, regulatory and other restrictions effectively limit all UWB traffic to one channel (channel 9), which operates in the 8-GHz frequency band. Channel 9 is globally available and largely free from interference with other common technologies. But as deployments scale, relying on a single channel limits performance, increases the likelihood of device interference and impedes widescale UWB adoption.

To alleviate this, the industry is investigating a move to harmonize UWB support across additional channels, specifically channels 10 (~8.4 GHz) and 12 (~9 GHz). These channels offer wide global availability and relatively clean spectrum, and they are expected to dominate future expansion efforts. Other channels, including 13 through 15, have also been defined but are considered less suitable for low-power UWB technology due to effects such as increased path loss, which can degrade signal strength.
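The path-loss penalty is easy to quantify with the free-space model. The sketch below compares nominal channel centers near 8 and 10 GHz (frequencies are approximate, and real-world losses from walls and bodies grow faster still):

```python
import math

C = 299_792_458  # speed of light, m/s

def fspl_db(distance_m, freq_hz):
    """Free-space path loss: 20 * log10(4 * pi * d * f / c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Moving from ~7.99 GHz (channel 9) to ~9.98 GHz (channel 14) adds
# about 2 dB of free-space loss at any fixed distance.
extra_loss_db = fspl_db(10, 9.984e9) - fspl_db(10, 7.9872e9)
```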

What UWB Channel Expansion Means for the Automotive Sector

For automakers and Tier 1 suppliers, access to additional channels would support greater redundancy and flexibility in safety applications. The most prominent of these is digital car keys, where UWB is used to securely access the vehicle. UWB’s fine ranging capability – accurate to within centimeters – helps prevent relay attacks by verifying that a key fob or smartphone is physically close to the vehicle.

Other automotive uses accelerating the need for additional UWB channels include:

  • Child presence detection, seatbelt reminders and automatic trunk access, all of which rely on UWB-based radar to detect and locate objects and people.
  • Autonomous charging alignment, which guides robotaxis and other electric vehicles to precisely dock with charging stations.
  • Battery management systems, where UWB can replace bulky wire harnesses by transmitting data, such as charge state and temperature, between battery cells.

Testing Requirements for UWB in Automotive

As the automotive industry increases its reliance on UWB, test complexity is increasing. Unlike many consumer devices that rely on pass/fail or “go/no-go” tests, automotive systems demand more rigorous examination – particularly at the RF physical layer.

Parametric testing is essential to ensure UWB devices meet quality, reliability and regulatory standards over their lifecycle – from R&D and validation to high-volume production. This includes metrics such as:

  • Signal quality across multiple frequencies.
  • Timing precision for fine ranging and radar use cases.
  • Robustness in multi-device and multi-channel environments.

As production volumes grow, multi-device test efficiency becomes critical. Test solutions must balance cost and throughput while maintaining tight tolerances, enabling OEMs to handle millions of devices per year and still meet the automotive sector’s quality standards.

The testing approach also depends on which integration model is being used. For instance, if a Tier 1 supplier integrates silicon from a chipset vendor with their own front-end and other components, additional testing may be required at the module or system level. In contrast, the use of fully integrated modules, which may only require the addition of an antenna, can reduce the test burden but still necessitates thorough verification during vehicle assembly.

With UWB in a state of flux, there’s also a need to future-proof test equipment. LitePoint’s systems already support all UWB channels up to 10.6 GHz.

UWB Is Becoming a Fixture in Automotive Applications

UWB has already proven itself as a vital automotive technology thanks to its excellent security, precise location and measurement capabilities, and low power consumption. As the number of applications grows, channel expansion is both necessary and inevitable to maintain performance, minimize interference with other UWB and Wi-Fi devices, and enable deployments to meet anticipated demand.

For automotive OEMs and Tier 1 suppliers, this means not only preparing for more channels but establishing comprehensive test protocols capable of handling increasingly complex system integration. Getting this right will ensure UWB continues to enable secure, intelligent and feature-rich vehicles in the years ahead.

3 for 3: UWB, The Latest Enhancements

UWB is advancing with the 4ab amendment, bringing meaningful improvements in reliability, speed, and scalability. Unlike other wireless technologies focused on high data rates, UWB excels in real-time positioning and secure access. With growing adoption in mobile and automotive devices, the technology is evolving. Our video series, 3 for 3, provides 3 answers for 3 pressing questions about trends in wireless test. In our latest video, LitePoint’s Adam Smith dives into the latest enhancements in Ultra Wideband, or UWB, technology.

Watch video here

From Bluetooth-enabled glucose monitors to Wi-Fi-equipped imaging systems, connected wireless medical devices are playing a more prominent role in home and healthcare settings by ensuring critical health data is transmitted securely, reliably and in real time. The ability to shuttle data to and from the cloud is a game-changing convenience for patients and clinicians. But the process of embedding wireless features introduces a new layer of design and manufacturing complexity for medical device makers that must meet operating specifications, regulatory requirements and user expectations – every time.


Given the scope of wireless connectivity options, a comprehensive test program is a must-have component of any medical device product development toolkit. Far from a single event, test is a multi-stage process that evolves across each phase of design and production. When incorporated strategically into the development cycle, wireless testing provides more than validation – it becomes a competitive advantage that ensures product reliability, accelerates time-to-market and reduces long-term costs.

LitePoint has built its reputation delivering leading-edge test solutions across industries ranging from consumer smartphones to automotive and industrial IoT. By applying our expertise to medical device design, we help customers make the right technology choices, guide them through each test stage and future-proof test setups for decades-long product lifecycles.

Selecting the Right Wireless Technology

Medical device companies are experts in clinical function, but adding wireless components can introduce new challenges. Their priority is accurate diagnostics, safe operation and regulatory compliance. Given the wide array of cellular, Wi-Fi, Bluetooth and other communications options, outfitting devices with wireless capability requires careful deliberation.


Often, the first step when adding a wireless feature is choosing a compatible chipset or module supplier. For high-volume consumer electronics companies, vendor selection is typically a clear choice given the deep partnerships they may have with wireless module or chipset providers and their manufacturing partners. Medical device makers may lack the same level of insight into the wireless ecosystem.

This is where LitePoint adds value even before design begins. Our close work with wireless chipset makers across industries means we understand which

  • chipsets have robust test tools versus those requiring external solutions.
  • modules or SoCs deliver easier integration with medical form factors.
  • platforms offer better long-term support and lower testing overhead.

By acting as a consultant at this early stage, LitePoint helps medical device companies avoid downstream surprises and build a foundation for reliable wireless performance.

R&D: Characterizing Device Performance

Once a design direction is chosen, R&D teams move into prototyping. Here, wireless testing is highly exploratory.

Calibration & Characterization: Engineers test single devices, often using hardwired setups, to measure fundamental parameters such as transmit power, EVM, frequency accuracy, receiver sensitivity and more.

Antenna Validation: Typically requires over-the-air (OTA) tests to characterize the performance of the antenna.

Debugging & Optimization: Results inform early design changes before hardware decisions are locked in.

The characterization stage requires a meticulous understanding of the OTA test environment, including accurate link budget calculations, optimal component selection – from the cable to antennas to choice of chamber – and precise calibration. LitePoint’s experts recommend strategies to prioritize test parameters, structure test environments and identify early measurements that save time downstream. This emphasis on visibility helps to unearth problems in early-stage product development.
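As a concrete example of the link budget arithmetic involved, the sketch below estimates received power over an OTA test link using the standard free-space path loss formula. The function names and parameter choices are illustrative, not part of any LitePoint tool:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3e8)

def rx_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, distance_m, freq_hz,
                 misc_loss_db=0.0):
    """Expected received power for an OTA link budget.

    misc_loss_db lumps together cable, connector and chamber losses,
    which must be measured for the actual setup.
    """
    return (tx_dbm + tx_gain_dbi + rx_gain_dbi
            - fspl_db(distance_m, freq_hz) - misc_loss_db)
```

At 2.4 GHz and 1 m separation, free-space loss alone is roughly 40 dB, which is why every other loss term in the chain must be known precisely before sensitivity numbers can be trusted.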

Design Verification: Regression Testing for Repeatability

Once prototypes stabilize, teams transition to DVT, or design verification testing. Now, the focus shifts from exploration to repeatability: running the same test cases across multiple frequency bands and power levels and stress testing hardware under variable conditions.

Regression testing ensures performance is consistent, both under ideal conditions and across operating ranges. For medical devices, this is critical as regulators and healthcare providers expect reliable performance regardless of interference, distance or use case.

 

IQfact+ Software

 

This is where automation becomes essential. Manually repeating hundreds of test cases is impractical. LitePoint’s automation software allows teams to accelerate regression testing, process large datasets and uncover inconsistencies in performance before validation.
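A regression sweep of this kind reduces to a nested loop over bands and power levels. The sketch below is a generic illustration: the `measure_evm` callable stands in for a hypothetical instrument API, and the band list and 3% EVM limit are made-up example values, not requirements from any standard:

```python
import itertools

BANDS_MHZ = [2412, 2437, 2462, 5180, 5500]   # illustrative Wi-Fi test frequencies
POWER_LEVELS_DBM = [0, 5, 10, 15]            # illustrative output power steps

def run_regression(measure_evm, evm_limit_pct=3.0):
    """Sweep every band/power combination and collect failures.

    measure_evm(freq_mhz, power_dbm) is a placeholder for the real
    instrument call; it returns an EVM percentage.
    """
    results, failures = {}, []
    for freq, pwr in itertools.product(BANDS_MHZ, POWER_LEVELS_DBM):
        evm = measure_evm(freq, pwr)
        results[(freq, pwr)] = evm
        if evm > evm_limit_pct:
            failures.append((freq, pwr, evm))
    return results, failures
```

The value of automating this is less the loop itself than the bookkeeping: every run produces a complete, comparable dataset, so a regression introduced by a firmware or layout change shows up as a diff rather than an anecdote.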

Quality Assurance: Testing Under Real-World Conditions

Even the best lab results must hold up in the real world. Before transitioning to manufacturing, devices undergo quality assurance or output quality check testing. Unlike earlier phases, these tests combine clinical use with wireless connectivity:

  • A Bluetooth blood pressure monitor must transmit to a smartphone without dropouts.
  • A glucose patch must sync with a cloud app, even while performing clinical functions.
  • A wearable ECG must stream data in environments with background Wi-Fi traffic.

 

This stage blends user experience testing with RF validation. LitePoint helps medical device makers recreate realistic scenarios inside controlled environments, capturing RF metrics while applications are running.

Manufacturing: Scaling Test Without Sacrificing Quality

When products move into high-volume manufacturing, the goals shift again: testing must scale in step with production, a high percentage of devices must pass first-time tests and test costs must be managed to protect product sales margins.

Unlike R&D and DVT, manufacturing test environments often deal with packaged devices where direct RF connections aren’t possible. Over-the-air testing, chambers and multi-device setups become critical.

LitePoint specializes in multi-device manufacturing test solutions that optimize throughput without compromising accuracy. By leveraging multiple integrated vector signal generators and analyzers, combined with proprietary automation, LitePoint helps customers reduce idle test time, improve yield and lower cost per unit.
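The throughput math behind multi-device testing is straightforward: devices tested in parallel multiply the units-per-hour (UPH) of a single station, while per-unit handling time caps the gain. A minimal sketch, with illustrative numbers rather than figures from any real production line:

```python
def units_per_hour(test_time_s, devices_in_parallel, handling_time_s=0.0):
    """Estimated throughput of a multi-device test station.

    test_time_s:     RF test time for one batch of devices
    handling_time_s: load/unload time per batch (idle for the tester)
    """
    cycle_s = test_time_s + handling_time_s
    return 3600.0 / cycle_s * devices_in_parallel
```

For example, a 30-second test run on four devices at once yields 480 UPH with zero handling time, but adding just 6 seconds of handling per cycle drops that to 400 UPH, which is why reducing idle test time matters as much as the measurement speed itself.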

Future-Proofing Test Beds

Unlike most consumer electronics, medical devices have long lifecycles. Once approved by regulators, manufacturers prefer to maintain test setups for years without disruption. Any change risks new compliance hurdles.

At the same time, wireless standards are constantly evolving. Bluetooth, for example, has already graduated from version 4.0 to 6.0. Wi-Fi 6 is giving way to Wi-Fi 7 and soon Wi-Fi 8. Medical device makers must plan for these shifts even if today’s device uses an older standard.

LitePoint helps customers future-proof their test investments by delivering platforms that support multiple wireless generations, ensuring software updates keep pace with new standards and providing long-term service and support to extend the life of test setups.

Test as a Strategic Advantage

Medical device manufacturers should avoid treating wireless testing as a late-stage checklist item. As this lifecycle analysis shows, testing evolves from vendor selection to R&D characterization, verification, quality assurance, manufacturing and long-term sustainability. When approached strategically, test is not just another compliance step. It’s a competitive advantage for reliably connecting life-enhancing medical instruments, reducing time to market, optimizing yield and manufacturing costs, and protecting investments against evolving standards.

LitePoint is uniquely positioned to guide medical device makers through this journey. With deep expertise across wireless technologies and proven partnerships with chipset vendors, we act as both a test provider and a consultant, helping customers succeed from the cradle of design through high-volume manufacturing and beyond.

With wireless connectivity evolving at breakneck speed, I recently had an opportunity to hit the pause button and take stock of 5G open radio access network (O-RAN) technology and what the future holds for O-RAN radio unit (O-RU) testing.

In my musings, I was joined by RCR Wireless News magazine editor-in-chief, Sean Kinney, who led a panel discussion on the evolution of O-RAN open innovation. Sean set the stage with a reminder that O-RAN is a movement founded by wireless operators seeking to diversify suppliers, increase flexibility, reduce costs and effectively “break apart” the RAN to expand its reach and appeal. In the past several years, that’s led to the release of more than 100 specifications covering everything from interfaces to platform orchestration.

O-RAN Trends to Watch

More recently, the O-RAN Alliance – and the industry at large – has championed cloud-based radio access as a means to conserve energy and enhance spectral efficiency. The ability to tie into the cloud is also spurring efforts to leverage AI to support shared O-RAN compute resources and edge-based services.

With 6G on the horizon, O-RAN is evolving beyond enhanced mobile broadband. Features like “mini-slot” scheduling allow for lower-latency data movement, because data no longer has to wait for the entire time slot to elapse before transmission can begin. This is important for time-sensitive technology like ultra-reliable low-latency communication (URLLC) in applications ranging from industrial automation to autonomous vehicles.


2025 is also shaping up to be a busy year on the integration front. O-RU designs are moving away from bulky FPGAs toward system-on-chip solutions as a means to significantly reduce power consumption and lower BOM costs. The bar is high in terms of cost cutting, but the goal is to make O-RUs as cost-effective as enterprise access points, which is critical for O-RAN to thrive in a competitive marketplace.

Interoperability Testing Ensures O-RAN Adoption at Scale

These exciting radio initiatives have a common trait: they depend heavily on smarter testing and integration. The O-RAN community has spent years promoting O-RU interoperability, for example, but it’s time to move past paper conformance and lean into real-world feedback to better manage vendor test strategies.

At LitePoint, RU testing is no longer just about verifying boxes on a checklist – its value is best appreciated when validating performance under pressure. For years, we focused on meeting front-haul and 3GPP conformance requirements, but now, with real carrier deployments underway, network operators are pushing testing into four key areas:

  • Real-world scenarios that simulate traffic slicing, bursty data and mixed-service types.
  • Stress testing with full-bandwidth utilization, high modulation rates and peak-to-average power challenges to push the front-end modules to their limit.
  • Multi-vendor compatibility that ensures radios adapt seamlessly to different distributed units.
  • Proprietary features unique to specific carrier deployments.

MIMO Migrates to Parallel Antenna Testing 

Another area ripe for innovation is MIMO antenna testing. Traditional antenna-by-antenna (or chain-by-chain) testing offers only a narrow view of performance by focusing mainly on power measurements. More complete MIMO testing reveals how an entire system performs under real-world conditions.

A case in point is error vector magnitude (EVM) – a key metric that quantifies the difference between an ideal transmitted signal and the actual received signal. In real-world tests, MIMO measurements can uncover performance degradation that single-chain tests completely miss. Problems like crosstalk, PCB layout errors, front-end module limitations and potential issues with power supply bypass capacitors are uncovered only when multiple antennas operate in parallel, as they would in live deployments.
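For reference, RMS EVM is simply the ratio of error-vector energy to reference-signal energy. A minimal, generic computation over constellation points is shown below; real standards add normalization and equalization rules on top of this, so treat it as the definition rather than a conformance measurement:

```python
import math

def evm_percent(reference, measured):
    """RMS EVM (%) between ideal and measured constellation points.

    reference, measured: equal-length sequences of complex symbols.
    """
    err_energy = sum(abs(m - r) ** 2 for r, m in zip(reference, measured))
    ref_energy = sum(abs(r) ** 2 for r in reference)
    return 100.0 * math.sqrt(err_energy / ref_energy)
```

A 10% amplitude error on a single symbol gives 10% EVM; in a parallel MIMO test, crosstalk from an adjacent chain shows up as exactly this kind of error vector even when each chain measures clean in isolation.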

By measuring all chains simultaneously, parallel MIMO antenna chain testing can reduce costs by doubling test throughput and unit-per-hour test rates. As manufacturers prepare for large-scale production, MIMO testing is becoming the gold standard, not just for performance validation but for enabling scalable, cost-effective manufacturing.

Automation Is the Key to Scale

Automation is where O-RAN testing gets truly exciting. One of the beautiful things about O-RAN is the ability to take advantage of the M-plane, the management interface carried over the fronthaul link that connects the O-RU and the distributed unit (O-DU).

 

Thanks to the O-RAN M-plane interface, we can now create a relatively generic automation framework that allows any vendor to simply drop in their specific M-plane XML file for the radio control. This makes it easy to automate radio testing from different vendors to validate performance, optimize through calibration and quickly scale to high-volume manufacturing.
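As an illustration of the idea, a generic framework only needs to parse whatever XML the vendor supplies into a neutral set of radio-control parameters the automation can act on. The element names below are invented for the example and do not come from any specific vendor’s M-plane model:

```python
import xml.etree.ElementTree as ET

# Hypothetical vendor M-plane radio-control fragment; tag names are
# illustrative only, not taken from any real vendor's YANG model.
VENDOR_XML = """
<radio-control>
  <carrier id="0">
    <center-frequency-mhz>3550</center-frequency-mhz>
    <bandwidth-mhz>100</bandwidth-mhz>
    <tx-power-dbm>24</tx-power-dbm>
  </carrier>
</radio-control>
"""

def load_carriers(xml_text):
    """Parse a vendor radio-control file into neutral parameter dicts."""
    root = ET.fromstring(xml_text)
    return [
        {
            "id": c.get("id"),
            "freq_mhz": int(c.findtext("center-frequency-mhz")),
            "bw_mhz": int(c.findtext("bandwidth-mhz")),
            "power_dbm": int(c.findtext("tx-power-dbm")),
        }
        for c in root.findall("carrier")
    ]
```

Because the automation consumes only the neutral dictionaries, swapping radio vendors becomes a matter of dropping in a different XML file rather than rewriting test scripts.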

O-RAN Reality Check

Despite the gloomy headlines about O-RAN’s commercial struggles, innovation hasn’t stopped. If anything, the push for lower costs, better interoperability and smarter test solutions is accelerating. The ecosystem remains vibrant, but success will depend on whether we can keep driving performance up and costs down while ensuring reliability in every real-world deployment.

To that end, every step of O-RAN radio development – from raw silicon to finished goods – relies on product-level testing. Isolated testing of RUs isn’t just practical; it’s the only way to capture and optimize key performance metrics while identifying failures early. As someone who’s deeply invested in RU testing, I can say this: the work we’re doing today is laying the groundwork for O-RAN’s next big leap.

Medical devices are increasingly wireless, supporting patient care, monitoring and diagnostics through technologies like 4G LTE/5G cellular, Wi-Fi and Bluetooth®. When defining their products, medical companies invariably encounter a key integration option: should they embed wireless functions using a proprietary chip-on-board (CoB) design or rely on a pre-certified module from a third-party provider? The decision can directly influence product development costs and timelines, regulatory testing loads and final product performance.

This foundational design decision is becoming more important as the medical community strives to meet demand for wearable monitors and diagnostic devices and expands use cases for in-home remote health management and rural locales where hospital access may be difficult. The differences between these approaches have a direct bearing on how manufacturers design, certify and test their devices as they prepare them for volume manufacturing. And as demand grows, many medical companies find that their “make vs. buy” decisions are informed by the degree to which they have access to in-house RF engineering expertise.

Balancing Flexibility and Simplicity During Product Development

For space-constrained or highly customized devices, chip-on-board affords designers tighter control over integration and device operation. This enables more flexibility when it comes to product form factor and performance features. However, CoB also requires significant RF engineering knowledge; it depends on close – and often time-intensive – collaboration with chipset vendors, and demands extensive debug cycles during development.

Source: Ezurio

Modules simplify the path to integration. These off-the-shelf components are pre-tested and pre-certified, reducing the burden on device manufacturers. They enable faster development cycles and are often preferable for medical companies that prioritize clinical function over RF customization. The trade-off is cost – modules are typically more expensive – but they reduce engineering time, complexity and risk.

A common hybrid approach involves combining both: for instance, using a 5G module due to regulatory complexity and CoB for Wi-Fi to save space or cost.

How Manufacturing Affects Test Time and Scalability

Testing is critical in wireless device production. CoB designs require more comprehensive RF test coverage, including direct chipset control and access to factory test modes. While this enables extensive validation, it also increases test time and demands more from manufacturing teams.

 

Modules, on the other hand, have limited access to internal test modes. Testing is often conducted through AT (attention) commands with reduced test coverage. This results in shorter test times and easier scalability for high-volume production.
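As a small example of this style of testing, AT responses are plain text that the test executive parses. The sketch below converts the standard 3GPP TS 27.007 `+CSQ` signal-quality reply into a dBm value; the mapping (0 corresponds to -113 dBm, in 2 dB steps up to 31) comes from that spec:

```python
import re

def csq_to_dbm(response):
    """Convert a '+CSQ: <rssi>,<ber>' AT reply to received power in dBm.

    Per 3GPP TS 27.007, rssi 0..31 maps from -113 dBm to -51 dBm in
    2 dB steps; 99 means 'not known or not detectable'.
    """
    m = re.search(r"\+CSQ:\s*(\d+),\s*(\d+)", response)
    if not m:
        raise ValueError("unexpected AT response: %r" % response)
    rssi = int(m.group(1))
    if rssi == 99:
        return None
    return -113 + 2 * rssi
```

The trade-off described above is visible here: the module reports a coarse, pre-digested number, whereas chipset-level test modes would expose the raw parametric measurements behind it.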

Both approaches scale similarly in terms of physical factory setup; it’s test throughput per unit that differs. LitePoint supports both scenarios with test automation solutions and multi-device testing platforms that optimize units-per-hour (UPH) even with expanded test coverage.

Juggling the Variabilities of Compliance and Final Product Testing

CoB-based designs place the full regulatory burden on the OEM, including FCC wireless compliance in the United States and international certifications such as EU-mandated conformity assessment (CE) testing. This includes emissions, coexistence and RF exposure tests. Modules alleviate much of this by offering pre-certification, especially for complex technologies like 5G, which operate in licensed spectrum.

However, assuming a module requires no further testing is a common mistake. Even pre-certified modules must undergo final product validation, as antenna placement, housing materials and PCB layout can affect wireless performance. LitePoint emphasizes the need for end-to-end testing, ensuring the assembled product performs to specification – especially for mission-critical medical applications.

 

Each wireless protocol – Bluetooth, Wi-Fi, 5G – has unique test cases. Bluetooth Low Energy and Bluetooth Classic, for example, differ with respect to modulation scheme, limits on frequency deviation, sensitivity and more. When products support multiple radios, coexistence testing is essential to ensure technologies like Wi-Fi and Bluetooth operating at 2.4 GHz do not interfere with each other.
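The 2.4 GHz overlap that coexistence testing probes can be computed directly from the channel plans. The helper below uses the standard BLE (2402 + 2k MHz) and 2.4 GHz Wi-Fi (2407 + 5n MHz center, 20 MHz wide) channel maps; the simple ±10 MHz overlap criterion is an illustrative approximation, not a test-plan requirement:

```python
def ble_channel_freq_mhz(k):
    """Center frequency of BLE RF channel k (0-39): 2402 + 2k MHz."""
    return 2402 + 2 * k

def overlaps_wifi(ble_channel, wifi_channel):
    """True if the BLE channel falls inside a 2.4 GHz Wi-Fi channel's
    nominal 20 MHz occupied bandwidth (center = 2407 + 5n MHz)."""
    wifi_center_mhz = 2407 + 5 * wifi_channel
    return abs(ble_channel_freq_mhz(ble_channel) - wifi_center_mhz) <= 10
```

This is why BLE’s adaptive frequency hopping and coexistence test cases matter: a large fraction of BLE channels sit inside whichever Wi-Fi channel the product’s own access point occupies.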

Why LitePoint is the Right Test Partner

LitePoint’s value extends beyond test equipment. We provide the RF expertise, support and turnkey automation that medical OEMs need, in addition to the industry’s largest installed base of more than 350 wireless chipsets. For companies unfamiliar with RF or constrained by timelines, LitePoint’s controlled test environment offers:

  • Out-of-the-box automation software tailored for chipsets and modules
  • Test coverage for several wireless technologies including but not limited to 4G/5G, Wi-Fi 6/6E/7, Bluetooth Classic and Bluetooth Low Energy
  • Scalable solutions, including over-the-air and multi-device testing
  • Testing that extends beyond Pass/Fail to deliver insights such as transmit quality analysis
  • Industry-leading customer support that shortens debug and validation cycles

LitePoint enables medical device companies to focus on clinical performance while ensuring wireless quality and compliance. Whether choosing CoB, module integration or a mix of both, LitePoint simplifies testing and accelerates time to market, delivering faster test speeds to expedite production throughput and reducing test errors to improve yields.

The choice between chip-on-board and modules in medical electronics isn’t binary. It depends on form factor, engineering bandwidth, time-to-market goals and regulatory obligations. What is consistent, however, is the need for reliable, robust, flexible and scalable testing.

As 5G networks expand, connecting a wider variety of devices is becoming essential. Our video series, 3 for 3, provides 3 answers for 3 pressing questions about trends in wireless test. In today’s video, LitePoint’s Khushboo Kalyani focuses on a critical evolution in the world of 5G that’s set to transform the Internet of Things: 5G Reduced Capability, known as RedCap, and its evolution, eRedCap. Kalyani discusses what 5G RedCap and eRedCap are, where they fit in the IoT landscape, and their primary use cases.

 

WATCH VIDEO HERE