Are You Sure New Mobile Devices Are Ready For Your Customers?
With a myriad of new handsets now on the market, mobile service providers are anticipating that the latest devices from Apple, Samsung, Huawei, Google, and others will encourage subscribers to upgrade both their hardware and their contracts.
But will the latest devices live up to expectations?
How many of us have purchased a new smartphone, only to find that downloads seemed slower, that the sound or video quality in a favourite app wasn’t great, or that some other feature fell short of expectations?
Manufacturers’ datasheet specifications for performance metrics are only the beginning. To understand and measure the user experience, service providers need to know how smartphones will perform in the real world, on a specific provider’s network, with its own configurations and nuances.
When good networks and devices go bad, consumers don’t perceive inter-cell interference, codec mismatch, failed carrier aggregation, or excessive graphics-chip current draw. They simply experience compromised video, calls and app performance.
The best carriers and OEMs in the world test for these experiences, in the lab and in the field, benchmarking against both their own past results and other providers.
To do this measurement, some mobile service providers may use periodic, subjective usability tests run by test team members, but this approach is neither objective nor repeatable, nor does it provide a substantive assessment of overall user experience.
To really understand how devices from different vendors perform, a consistent and objective approach is required that works on any device, regardless of manufacturer, operating system or model.
Device acceptance teams that objectively verify whether new devices deliver what they promise close the gap between unknown real-world performance and user expectations, and can help minimise the risk of financial impact.
Spirent has a range of solutions to measure end-user experience on real-world devices and most recently launched a state-of-the-art artificial intelligence solution that assesses video just as your eyes do. The result is a no-reference algorithm, trained on thousands of video samples, that determines a video mean opinion score (VMOS) based on de facto industry standards and correlated with human perceptual scoring. This means it can analyse the content by itself to detect poor quality and perform scoring without prior knowledge of the original video.
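Spirent’s trained model is proprietary, but the idea behind no-reference scoring can be sketched with a toy example: derive perceptual features directly from the frames themselves (here, a simple Laplacian-based sharpness proxy) and map them onto a 1–5 MOS-like scale. The feature, calibration constants, and function names below are illustrative assumptions for this sketch, not Spirent’s algorithm or API.

```python
import numpy as np

def laplacian_variance(frame):
    """Sharpness proxy: variance of a discrete Laplacian.

    Blurry or flat frames have little high-frequency energy and score low.
    """
    lap = (-4 * frame
           + np.roll(frame, 1, axis=0) + np.roll(frame, -1, axis=0)
           + np.roll(frame, 1, axis=1) + np.roll(frame, -1, axis=1))
    return lap.var()

def no_reference_vmos(frames):
    """Toy no-reference score on a 1-5 MOS-like scale.

    Averages the per-frame sharpness feature and squashes it into [1, 5]
    with a logistic curve. A real system would use many features and a
    model trained against human perceptual scores; these constants are
    a made-up calibration for illustration only.
    """
    sharpness = np.mean([laplacian_variance(f) for f in frames])
    return 1.0 + 4.0 / (1.0 + np.exp(-(sharpness - 0.02) / 0.01))

# Synthetic clips: noisy frames carry high-frequency detail, flat frames none.
rng = np.random.default_rng(0)
sharp = [rng.random((64, 64)) for _ in range(5)]
blurry = [np.full((64, 64), 0.5) for _ in range(5)]

score_sharp = no_reference_vmos(sharp)
score_blurry = no_reference_vmos(blurry)
```

Note that neither score requires the pristine source clip: the featureless frames land near the bottom of the scale and the detailed frames near the top, which is the essential property of a no-reference metric.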
Learn more about how Spirent is helping organisations to create faster, less expensive and more repeatable quality assurance methodologies for video, voice and data at www.spirent.com/Products/Umetrix.