2012 Evaluation Team Report: Web Browser Evaluation

Executive Summary

The 2012 Web Browser Evaluation Team was tasked with evaluating the impact of the changing web browser landscape on desktop and mobile operating systems. It is important to note that the team was not tasked with providing a recommendation on a single best-in-class browser; rather, it was asked to assess the current and upcoming generation of web browsers and consider the implications for Penn’s various constituents. The team divided Penn’s constituents into three communities and provides the following recommendations for each:

  1. Developers: The team suggests (at a minimum) that the primary goal for developers should be to provide access from an operating system’s built-in browser (for example, the latest Safari on OS X). The secondary goal should be to support one additional browser for that operating system (for example, the latest Firefox release or Firefox ESR). Developers should clearly communicate the expected level of functionality for each browser and application, and when access to an application using a particular browser is restricted, the reasons for the limitation should be clearly provided at point of access, in a format that is comprehensible to both End Users and Local Support Providers (a sketch of one way to record and surface this appears after this list).
  2. Support Providers: The team suggests that Local Support Providers (LSPs) should utilize a “Browser + 1” model on all managed systems, and should strongly encourage and promote this model on unmanaged systems. Because browser versions change frequently, LSPs should deploy frequent updates to managed systems, or allow users to update browsers themselves, to ensure that security vulnerabilities are mitigated.
  3. End Users: End Users managing their own systems should also adopt a “Browser + 1” model; doing so provides a reasonable expectation that services will be available.
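
The following TypeScript sketch is purely illustrative of recommendation 1: one way a development team might record the expected level of functionality for each browser and turn that record into a point-of-access message. The browser names, support levels, and notes are hypothetical examples, not results from the team’s testing.

    // Hypothetical sketch only: recording expected functionality levels per
    // browser and turning them into a point-of-access message. Browser names,
    // levels, and notes below are illustrative, not test results.
    type SupportLevel = "full" | "limited" | "unsupported";

    interface BrowserSupport {
      browser: string;   // e.g. "Safari (latest)" or "Firefox ESR"
      level: SupportLevel;
      note?: string;     // shown to End Users and Local Support Providers
    }

    const supportMatrix: BrowserSupport[] = [
      { browser: "Safari (latest)", level: "full" },
      { browser: "Firefox ESR", level: "full" },
      {
        browser: "Chrome (latest)",
        level: "limited",
        note: "File upload relies on a plug-in that is unavailable in this browser.",
      },
    ];

    // Build the message shown when a user reaches the application.
    function accessMessage(browser: string): string {
      const entry = supportMatrix.find((b) => b.browser === browser);
      if (!entry || entry.level === "unsupported") {
        return `${browser} is not supported for this application; please use one of the supported browsers.`;
      }
      if (entry.level === "limited") {
        return `${browser} is partially supported: ${entry.note ?? "some features are unavailable."}`;
      }
      return `${browser} is fully supported.`;
    }

    console.log(accessMessage("Chrome (latest)"));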

As the environment continues to evolve, schools and centers will need to continue adopting support for a “bring your own” environment as it pertains to web browsers.

Evaluation Methodology

The 2012 Web Browser Evaluation Team considered the built-in and most popular (as of early 2012) web browsers for desktop and mobile platforms. These included Firefox and Internet Explorer (with multiple versions identified), and Safari and Chrome (with the latest version indicated). The team considered University-supported desktop and mobile platforms (with the addition of Android due to its popularity). The team began by identifying pertinent Penn-provided and Penn-affiliated websites with heavy usage by University personnel. After identification, a testing matrix was developed for each combination of browser and application (see the mobile and desktop evaluation sheets). For each combination, a “Yes”/“No” answer was recorded for functionality. When an issue was encountered with a specific combination, a note was made indicating (in as much detail as possible) the cause and problem.
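
For illustration only, a structure such as the following could capture the same kind of information recorded in the testing matrix; the application and browser entries below are placeholders rather than rows from the evaluation sheets.

    // Illustrative structure only; the application and browser names are
    // placeholders, not entries from the evaluation sheets.
    interface MatrixEntry {
      application: string; // a Penn-provided or Penn-affiliated website under test
      browser: string;     // e.g. "Internet Explorer 9" or "Firefox (latest)"
      works: boolean;      // the recorded "Yes"/"No" answer
      note?: string;       // cause and problem details when an issue was encountered
    }

    const results: MatrixEntry[] = [
      { application: "ExampleApp", browser: "Internet Explorer 8", works: true },
      {
        application: "ExampleApp",
        browser: "Chrome (latest)",
        works: false,
        note: "Login page depends on a plug-in the browser blocks.",
      },
    ];

    // Collect the failures so the causes can be reviewed in one place.
    const failures = results.filter((r) => !r.works);
    failures.forEach((f) => console.log(`${f.application} / ${f.browser}: ${f.note}`));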

In addition, a separate “Pros and Cons” list was developed for each web browser platform. For each point of interest, documentation is provided from a reliable source. While the team does not make any recommendations on best practices or use cases for each browser, the hope is that this list can be an aid in making these decisions. (See: Pros and cons)

Assumptions

The team tested baseline browsers with only supported add-ons (for example, Java 6 update 33) and no major customizations to settings that would impact compatibility. The team attempted to maintain the integrity of testing while recognizing that its scope had to be fairly limited (due to time and manpower restrictions). While the team endeavored to accurately and fully test every application in every browser, some applications restrict access to certain features; where an application worked for all functionality the tester had access to, the team assumed the application was compatible. Due to time constraints, the team also assumed that a given browser behaves similarly across similar operating systems (for example, Firefox on Windows 7 and Windows Vista), except where indicated. Finally, because Android allows for heavily modified user interface (UI) overlays, the team adopted a supported stance for the built-in browser with a “use at your own risk” recommendation for overlays.

Results

In years past, the Browser Evaluation Teams provided a recommendation on a single best-in-class campus browser. Rather than recommending a particular browser, this year’s team developed a list of Pros and Cons for each browser on mobile and desktop platforms, to help Penn constituents make an informed decision based on preferences and use case. The list provides information (using cited resources) on browser strengths and weaknesses. The team also made a best effort to collect data for most combinations of modern web browsers, operating systems, and applications.

Analysis and Conclusions

As it pertains to developers, the team reiterates that applications should work with University-supported built-in browsers. On desktop platforms, these supported browsers are Internet Explorer 8 and 9 and the current version of Safari. On mobile platforms, these are the built-in browsers on iOS, Android, and Windows mobile. Developers should work to ensure all technology End Users are equipped with a baseline operating system and built-in browser that supports access to their application. While the optimal environment would be one where access to an application is browser-agnostic, the optimistic view is that developers will strive to support their applications using the “browser + 1” model (the operating system’s built-in browser plus a secondary browser). Where developers are not able to provide compatibility with a specific browser, the rationale for this limitation should be clearly communicated (for example, if a user logs into an application using Chrome and the application is not compatible, the website should present the specific reasons the limitation is in place) in a format that is both comprehensible to End Users and technically informative to Local Support Providers.
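
One way developers might satisfy this recommendation, sketched below in TypeScript under assumed requirements, is to test in the page itself for the specific capabilities an application needs and explain any gap in plain language at point of access. The two capabilities checked (FileReader and Web Storage) are illustrative assumptions, not requirements of any particular Penn application.

    // Hedged sketch: rather than matching user-agent strings, test for the
    // specific capabilities the application needs and explain any gap in plain
    // language. The two capabilities checked are assumptions for illustration.
    function missingFeatures(): string[] {
      const gaps: string[] = [];
      if (typeof FileReader === "undefined") {
        gaps.push("reading files for upload (FileReader API)");
      }
      if (!("localStorage" in window)) {
        gaps.push("saving drafts locally (Web Storage)");
      }
      return gaps;
    }

    const gaps = missingFeatures();
    if (gaps.length > 0) {
      // Plain language for End Users, with enough detail for an LSP to act on.
      alert(
        "This application cannot run fully in your current browser because it lacks: " +
          gaps.join("; ") +
          ". Please use your operating system's built-in browser or another supported browser."
      );
    }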

For Local Support Providers, particularly where a system is manageable, the emphasis is on enforcing, or allowing users to maintain, an up-to-date browser where possible (particularly easy with Firefox’s and Chrome’s update methodology, yet challenging with Safari and Internet Explorer). To some degree, limiting customizations that would impede access ensures a fairly homogeneous and consistent user experience across the board. When a system is not manageable, communicating the appropriate requirements and limitations that have been relayed by developers is key. When providing recommendations on browsers for particular use-case scenarios, LSPs can draw on the “Pros and Cons” list. Where browser-agnosticism isn’t possible, the decision on which browser to use should rest on the merits of the browser itself and user preferences. The push should also be for the “browser + 1” model espoused in this report, ensuring a fallback should developers be unable or unwilling to follow the above recommendations.
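
As a minimal sketch of only the version-checking portion of this practice, the comparison below flags an installed browser version that falls short of a chosen baseline; how the installed version string is collected is OS-specific and left out, and the version values shown are assumptions rather than University policy.

    // Minimal sketch of the version-comparison step only; collecting the
    // installed version string is OS-specific and outside this example.
    // The baseline values shown are assumptions for illustration.
    function isAtLeast(installed: string, minimum: string): boolean {
      const a = installed.split(".").map(Number);
      const b = minimum.split(".").map(Number);
      const length = Math.max(a.length, b.length);
      for (let i = 0; i < length; i++) {
        const x = a[i] ?? 0; // missing segments count as zero (e.g. "17" vs "17.0")
        const y = b[i] ?? 0;
        if (x !== y) {
          return x > y;
        }
      }
      return true; // equal versions meet the baseline
    }

    console.log(isAtLeast("17.0.1", "17.0")); // true: meets the baseline
    console.log(isAtLeast("10.0", "17.0"));   // false: flag this browser for an update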

As it pertains to End Users, we increasingly see an environment in which users expect more choice about their technological experience. Where access to an application is impossible, the onus is on developers to clearly communicate the specific incompatibility. By informing users about expectations, we can limit End User frustration and the extent to which they are unable to access web-based applications.

Date Posted: June 28, 2013 Tags: EvaluationTeam, Web Browser
