Your company’s website has a large influence on your customers’ impression of your business; it’s their window into your world. And if your product is a web-based system, then every detail counts.
While you may spend a lot of time designing how your website or system looks and performs, you can’t dictate which browser the customer chooses to use to interact with it. Different browsers give different results, with some performing better than others. Ensuring that your customers have a seamless, consistent, and issue-free experience is key to building a trusted brand and loyal customer base.
This is where cross-browser testing comes in.
Up to 88% of online consumers are less likely to return to a website after a bad experience. Additionally, 85% of adults expect a company’s website to look and behave just as well or better on mobile devices as it does on desktops. [Source: ‘27 Eye-Opening Website Statistics‘ by Sweor, 20 Jan 2022]
To understand how pivotal cross browser testing is for your web applications, let’s define what it is, lay out the best practices for getting it right, and identify some of the premier tools for the job.
What is cross browser testing?
Cross browser testing is the process of scrutinising and comparing the performance of your website or web application across various combinations of browsers, devices, operating systems, and viewport sizes. It also often extends to testing with assistive technologies and more.
From missing CSS resets to DOCTYPE errors and outdated browser detection, many issues can arise when you combine different browsers, devices, and operating systems. Cross browser testing is a very effective method of reducing the likelihood that these issues reach your customers.
What makes cross browser testing important?
Imagine a user visits a website for the first time and it doesn’t render images properly, the formatting is misaligned, or elements are missing. If things like this happen, there’s a good chance the user will assume the site is broken and look for an alternative.
It’s quite probable that you will lose this customer since their return visits will give them the same poor experience if there are browser compatibility issues. Cross browser testing can help you discover website failures specific to a particular browser so you can fix them and avoid alienating a portion of your potential customer base.
Next, we look at some worthwhile considerations across the various facets of cross browser testing – and sum up how important they are.
Adoption and usage rates
Currently, Chrome leads the way with 64.53% of the market according to StatCounter. Safari comes in second with 18.84%, with Edge and Firefox third and fourth respectively. So why exactly do these numbers matter? The reality is that if a specific browser has a large share of the market, it is very likely that a sizeable portion of your target audience will be using that browser. This often means there’s more to cover: more browser versions to test with, a greater variety of devices, and more operating systems too.
Therefore, it’s prudent to keep tabs on usage rates so that you can plan accordingly and avoid getting blindsided by the massive workload that could come with testing on some browsers. However, while it is evident that usage leans heavily towards Chrome, it’s not wise to focus entirely on testing for the top browser.
While these global usage rates are important, it’s equally important to consider the demographics of your customers and use any data you can gather on their specific setups – desktop vs mobile, browsers, versions, and so on – and skew your test coverage accordingly.
Testing on browsers with low adoption rates
Browsers with low adoption rates can offer a good opportunity for you to ascertain the website features and functions that may be ahead of the curve. Subsequently, you can find a way to keep the user experience appealing even when some elements aren’t behaving as they should.
For example, you might have some dynamic features like automatic slideshows, tickers with play/pause buttons, and other page navigation elements. But on a much rarer browser, some of these features may not work as desired. The ticker may remain static, or the slideshows may require manual scrolling, etc.
Testing on these browsers can help you polish your defensive coding. This is essentially the creation of fallbacks that a website can revert to if certain functions aren’t working correctly. For instance, a slideshow can be replaced with a banner containing all the slides. Some replacements will be simple graphical alternatives, while other functions may require more ingenuity.
Ultimately, visitors won’t have to miss out on any information or fail to sign up/check out simply because a pop-up window is malfunctioning, or some other glitch is getting in the way.
Perhaps the rarer browser is coming into favour rather than going out of fashion. This is where you can easily get a head-start on your competition. Some of the newer and less popular web browsers come with features to give them an edge such as data saving and other security measures.
Eventually, they open doors for new demands from customers who may later want these features without a change in the user experience on the website. Testing can help you learn how to tailor your website to these browsers and how they interpret CSS, HTML, XHTML, and other files.
Once the more established browsers catch on and start incorporating those new features, you may not need massive adjustments to your web project. Whether it’s a more thorough ad-blocker, a parental control filter, or another app integration, you’ll be able to tailor your site to it.
If a feature can’t be accommodated, you can display a message letting the user know that they should turn off a specific browser setting, add-on, or other configuration.
ROI of testing on older browsers
When building a website, you might try it out on the latest versions of the popular browsers you know. However, not everyone in the audience will be using those versions when your site goes live.
Cross browser testing can let you know whether your website will perform well on older versions of various browsers. This way, you won’t have to miss out on the visitors who aren’t always prompt with their browser updating.
Keep in mind that some users will be accessing your website through company computers that may have security restrictions that curtail automatic updating. So until the IT team do an office-wide update, employees will be using an older browser version and you’ll have to reach them through it.
Furthermore, with access to browser version usage data, you can estimate the number of users who may have a poor user experience as you discover the issues tied to different versions. This is particularly helpful if you have limited resources for revising your website’s code.
You can launch the website, then gradually make the necessary changes as you monitor the number of visitors using older browser versions. If the numbers go on to show that the bulk of your visitors are using older browser versions, you can slow down on revisions meant for the latest version and resume once the shift to the latest version approaches substantial levels.
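This estimate can be kept deliberately simple. The sketch below (with made-up version names and traffic figures) shows how version-share analytics translate into a rough count of visitors affected by an issue that only reproduces on older versions:

```python
# Rough estimate of how many visitors an older-browser issue affects,
# given hypothetical version-share analytics for one browser.

version_share = {          # fraction of monthly visitors per version
    "v120": 0.55,
    "v119": 0.25,
    "v110": 0.15,          # older version with a known layout bug
    "v95":  0.05,          # much older, missing several features
}
monthly_visitors = 40_000
affected_versions = {"v110", "v95"}  # versions where the issue reproduces

affected_share = sum(
    share for version, share in version_share.items()
    if version in affected_versions
)
print(f"Estimated affected visitors: {round(affected_share * monthly_visitors)}")
# Estimated affected visitors: 8000
```

If that number is small and shrinking month on month, a fix can safely wait; if it is large, the revision moves up the queue.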
This move may seem unnecessary, but once you realise just how frequently some browsers roll out newer versions, you’ll understand how tedious it can be to keep up with them in real-time.
Standard Operating Environment (SOE) vs. customer usage
Think of a standard operating environment as a blueprint that stipulates the type of operating system and accompanying software to be used throughout an organisation. An SOE can simplify cross browser testing since you’d basically be limiting your testing scope to the combination being used in an organisation.
However, SOEs still involve software updates from time to time, meaning that you’d still have to test in an updated SOE at some point.
Therefore it’s better to rely on customer usage numbers since they paint a more in-depth picture of the different operating environments available amongst your target audience. In addition, for internal/employee usage, many organisations are adopting the “Bring Your Own Device” (BYOD) practice, so it becomes much harder to tie a bunch of users to one browser-OS combination.
Best practices for cross browser testing
Cross browser testing can happen both during website or application development, and after the first major release, where browser tests focus on updates to the website and their cohesion with ever-evolving browsers.
Nonetheless, there are some core steps worth following, regardless of the stage you’re at:
+ Ascertain features to be tested
The first step is to identify every feature you want to test. You need to rank these in the order of their importance such that crucial elements like signing up or checking out after shopping are working as expected. Listing the features to test is also a foundational step in creating a well-organised plan for how you will test.
The main categories to test usually include basic functionality: timely display and positioning of dialogue boxes and menus, validation of form field inputs before acceptance, handling of first-party cookies and personalisation, and reliable touch input on tablets and other mobile devices.
Other realms of features tested can be design (fonts, images, banners, layout, etc.), compliance with accessibility standards, and responsiveness (ability to adjust to different screen sizes and orientations). This step also involves running the website on your primary browser to get a frame of reference for what it should look and behave like on other browsers.
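One simple way to capture this ranked feature list is as plain data that the whole team can review and reorder. The feature names and priorities below are illustrative only:

```python
# A ranked cross browser test plan captured as plain data.
# Feature names and priorities are hypothetical examples.

test_plan = [
    # (feature, category, priority: 1 = most critical)
    ("sign-up form validation", "basic functionality", 1),
    ("checkout flow",           "basic functionality", 1),
    ("first-party cookies",     "basic functionality", 2),
    ("layout and fonts",        "design",              2),
    ("screen-reader labels",    "accessibility",       2),
    ("orientation / resize",    "responsiveness",      3),
]

# Run the most critical features first on every target browser.
for feature, category, priority in sorted(test_plan, key=lambda item: item[2]):
    print(f"P{priority}: {feature} ({category})")
```

Sorting by priority ensures that crucial flows like sign-up and checkout are verified on every browser before lower-priority polish items.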
+ Select browsers
Since there are a multitude of browsers (Chrome, Firefox, Safari, Opera, Edge, etc.) out there, it helps to begin by narrowing the field down to the browsers prevalent amongst your target audience. This can be done based on:
- Popularity – Select the top ten browsers on the market and the top operating systems on which they are commonly run such as Android and iOS.
- Analysis – Analyse your website’s traffic statistics and look through reports on players in your industry/field to find out which devices and browser-OS combinations are most common amongst your visitors. Endeavour to test with any browser-OS combination that brings in 5% or more of your traffic.
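The 5% rule of thumb above is easy to apply mechanically once you have the analytics. In this sketch the traffic figures are made up for illustration:

```python
# Select browser-OS combinations worth testing, using a 5%-of-traffic
# threshold. The analytics figures below are hypothetical.

traffic = {
    ("Chrome",  "Android"): 0.31,
    ("Chrome",  "Windows"): 0.24,
    ("Safari",  "iOS"):     0.19,
    ("Edge",    "Windows"): 0.07,
    ("Firefox", "Windows"): 0.04,            # below threshold
    ("Samsung Internet", "Android"): 0.03,   # below threshold
}

THRESHOLD = 0.05
targets = [combo for combo, share in traffic.items() if share >= THRESHOLD]
for browser, os_name in targets:
    print(f"Test on {browser} / {os_name}")
```

Combinations that fall below the threshold aren’t abandoned entirely; they simply drop to a lower priority, or to the occasional spot-check described in the low-adoption section above.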
+ Harmonise manual testing with automation
Go through the list of all tests to be run, then assign them to human testers or automation tools. Typically, you want to assign all the straightforward tests like checking the appearance of buttons and menus to human beings.
For more intricate functions, like updating remaining stock figures or generating invoice reviews and order summaries, you might want to automate the tests. Not only are humans more likely to overlook a mistake or make errors of their own when running such tests, but they’ll also probably take more time to complete them.
+ Gather the appropriate tools and set up your infrastructure
When it comes to infrastructure, you can set up virtual machines and emulators and install the browsers you intend to test with. However, test results from virtual platforms may not accurately reflect what it’s like to visit the website from a particular device.
Furthermore, this approach is tricky to scale. The alternative is to set up a device lab with all the devices you need, and have their integrity consistently maintained.
This approach can be quite costly, hence the prevalence of cloud-testing infrastructures like BrowserStack’s Live, which can run tests in a remote lab of secure browsers and devices for a fraction of the cost of setting up your own physical device lab.
Effective cross browser testing tools
The premier cross-browser testing tools on the market that are worth a try include (please note prices listed are given in AUD and are correct at the time of publishing):
+ BrowserStack
One of its major advantages is that it will support tests built with Selenium, Playwright, Cypress, and Puppeteer. It also supports automation workflows written in a wide range of languages and frameworks.
But while it offers over 3,000 mobile and desktop browsers to test on, its accessibility test reporting isn’t delivered in a user-friendly manner (it arrives as a raw JSON object). It also doesn’t offer remote debugging via dev tools, which makes reproducing bugs discovered in live testing much harder.
BrowserStack will set you back $39/month for live testing, and $270/month for automated testing (desktop and mobile) with one parallel test. For more information, visit BrowserStack.
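Because services like this accept Selenium-style capability dictionaries, a browser matrix can be kept as plain data and reused across runs. The browser names and versions below are illustrative, and a real session would also need the provider’s remote URL and account credentials:

```python
# Sketch of a cross-browser capability matrix of the kind Selenium-based
# cloud services accept. Entries are illustrative; a real session would
# also require the provider's remote endpoint and credentials.

matrix = [
    {"browserName": "Chrome",  "browserVersion": "latest",   "os": "Windows"},
    {"browserName": "Safari",  "browserVersion": "latest",   "os": "OS X"},
    {"browserName": "Firefox", "browserVersion": "latest-1", "os": "Windows"},
]

def session_name(caps: dict) -> str:
    """Human-readable label for one test run in the matrix."""
    return f'{caps["browserName"]} {caps["browserVersion"]} on {caps["os"]}'

for caps in matrix:
    print(session_name(caps))
    # In a real run, each entry would feed a remote WebDriver session here.
```

Keeping the matrix in one place means usage-rate shifts (see the adoption section above) only require a data edit, not test-code changes.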
+ LambdaTest
This tool offers a neat user interface with over 2,000 browser-device combinations to test on. It supports over 60 integrations including GitHub, Azure Pipelines, GitLab, and Google Cloud CI. LambdaTest also simplifies bug/task generation. Testing metadata, test framework logs, network requests, and executed commands are available in the test report.
You can generate bugs/tasks within the testing dashboard or LT Browser, and use the built-in issue tracker to assign them to different team members. Unfortunately, LambdaTest is still limited to Selenium or Cypress test automation, heavily favours Chrome, and often delays making newer OS versions available for testing.
LambdaTest costs $20/month for live testing, and $135/month for automated testing with one parallel execution. For more information, visit LambdaTest.
+ Smartbear CrossBrowserTesting
With this tool, you can perform screenshot testing and codeless testing using artificial intelligence (AI) to record and replay interactions on screen. It offers over 2,000 target browsers and mobile devices, including lesser-adopted browsers like Maxthon, Dolphin, Sleipnir, Camino, and UC Browser.
Smartbear CrossBrowserTesting integrates with numerous CI/CD tools like TeamCity, Jenkins CI, VS Team Services, and others, while also supporting various PHP, Python, and NodeJS testing frameworks like Cucumber, Behat, Lettuce, Codeception, Mocha, and more. You can also rely on AI-assisted visual comparison through Applitools.
That aside, its automatic record-and-replay doesn’t always imitate recorded steps properly. It also lags behind in Apple device support and leans towards Safari for the supported iOS devices, with only a few supporting Chrome.
CrossBrowserTesting is priced at $42/month for live testing and $145/month for automated testing. For more information, visit Smartbear CrossBrowserTesting.
There are plenty of other cross browser testing tools you can choose from such as Sauce Labs, Applitools, Selenium, Browsershots, Eggplant, Ghost Inspector, Perfecto, Browsera, Experitest, Browserling, Mabl, Functionize, plus many more.
When selecting cross browser testing tools, look out for attributes like the number of browsers supported, ability to integrate with other developer tools, geo-location testing, and real-time debugging.
Conclusion
As with many other kinds of testing, cross browser testing is a continuous exercise, and it’s wise to have your team dedicated to keeping up with the latest changes in the world of browsers. If you’re delivering websites and web applications with frequent updates, you’ll want to balance speedy delivery with quality.
With the right frameworks, you can implement agile methodologies that will enable you to carry out cross browser testing and rectify any issues discovered in rapid iterations. Contact the luvo team via [email protected] for support with incorporating cross-browser testing into your testing frameworks.