Upside White Paper: Quality Framework for Sports Technologies
By Sports Tech Research Network (STRN)
Executive Summary
Identifying and applying tools to effectively and efficiently evaluate technologies is an area of increasing need for many sports stakeholders. A more robust and comprehensive evaluation process would help reduce the negative effects of adopting poor-quality technology, such as low return on investment, technology misuse, or data breaches.
This white paper introduces a standardized, evidence-based framework that sports technology stakeholders can adopt to assess the value, usability, and quality of technology. Developed in collaboration with 48 experts across the sports industry by means of a Delphi study design, the Sports Technology Quality Framework consists of 25 measurable features grouped under five quality "pillars": Quality Assurance & Measurement, Established Benefit, Ethics & Security, User Experience, and Data Management.
This framework can be used to help design and refine sports technology in order to optimize quality and maintain industry standards, as well as to guide purchasing decisions by organizations. It will also serve to create a common language for organizations, manufacturers, investors, and consumers to improve the efficiency of their decision making on sports technology.
The Global Challenge of Evaluating Sports Technology
Technology use is accelerating everywhere, and sports are no exception. The last 20 years have seen exponential growth in the development and use of technology to identify, monitor, train, recover, and rehabilitate athletes [1], [2]. Although these technologies were once confined to elite sport, the rapid democratization of technology and data has created an explosion of opportunities at the amateur and general consumer level as well [3], [4]. Furthermore, the increased commercialization of sport has broadened the notion of sports technology beyond athlete performance. Digital technologies and investable enterprises have emerged across a range of sport-related applications, including fan engagement, stadium experience, venue operations, and entertainment and content creation [5]-[7].
Key stakeholders face numerous challenges when evaluating the value, usability, and quality of products in the rapidly evolving sports technology marketplace (Figure 1).
High-performance staff, leagues, and governing bodies are inundated with more tech proposals in one week than they could reasonably review in a year. Coaches attempting a simple online search are met with a daunting list of tech options but have limited resources to distinguish hype from substance.
Tech start-ups receive considerable mentoring on how to develop a viable business but minimal direction on how to establish the quality of their product. Likewise, investors may look deeply into a company's financial projections while gaining only superficial insight into the technology's technical suitability.
Last but certainly not least, in a race for competitive advantage, it is often the players who have a limited voice in how sports technology is used to monitor and intervene in their training and performance.
Identifying tools and processes to effectively and efficiently evaluate technologies is an area of increasing need for many sports stakeholders.
Access to resources and training in this area would reduce the negative effects of adopting ineffective, unusable, burdensome, or unsafe technologies, including poor return on investment, wasted time and resources, dissatisfied consumers, or adverse events. As sports technology continues to outpace user expertise, there is a critical need to implement better education, policies, and processes for evaluating the quality of a given sports technology.
The Need for a Sports Technology Quality Framework
The regulatory environment for sports technology is not well defined. Unlike many other industries, the majority of sports technologies are not required to comply with statutory or regulatory requirements. At best, a patchwork of regulations and policies exists, largely contingent on the relevant sport, competition level, and geographic region. Additionally, regulations may exist for certain aspects of technology (e.g., physical safety, data privacy) but not others (e.g., accuracy, efficacy, or usability). However, consumers are often unaware of this and generally assume that a product has met some level of technical quality before becoming commercially available.
In many instances, the responsibility to confirm the technical quality of a product ultimately rests with the manufacturer. However, this is also a challenge for manufacturers, who cannot point to a unified standard against which to design and test their product. Also, given competitive and financial concerns, manufacturers rarely disclose technical information on the product, including how it has been evaluated and how it compares to competitor products.
On the academic research side, independent evaluation of a product may take years to conduct. Meanwhile, the technology of interest has often evolved its algorithms, firmware, and even hardware in the intervening period, rendering the research results obsolete as soon as they are published.
Sports governing bodies are making important strides in assessing the safety and validity of sports technologies [8]-[10].
For example, in 2017 the National Basketball Association (NBA) and the National Basketball Players Association (NBPA) created a first-of-its-kind joint Wearables Committee to review requests by NBA teams, the NBA, or the NBPA to approve a wearable device for use by players, clearly indicating the high priority placed on a standardized process for assessing sports technology [9].
In parallel, voluntary standards groups have begun rolling out test methods and evaluation criteria for specific health and fitness technology measures [11]-[16]. The sports science and research communities likewise have put forth several thoughtful approaches for decision-making around technology, aimed at various stakeholders [2], [17]-[19]. Despite current advances toward more rigorous evaluation of sports and related technology, a unified global framework for evaluating sports technologies remains sorely needed.
Development of the Sports Technology Quality Framework
Recognizing the aforementioned needs, the Sports Tech Research Network (STRN) convened a working group to develop a standardized, evidence-based, publicly available framework intended to help sports technology stakeholders evaluate the value, usability, and quality of technology. It was envisioned that this framework would be used to:
Help design and refine sports technology in order to optimize quality and maintain industry standards,
Guide purchasing decisions by facilitating comparison of technologies that perform the same function (e.g., optical tracking vs. GPS) or of different providers of the same technology (e.g., sleep watch #1 vs. sleep watch #2), and
Create a common language for organizations, manufacturers, investors, and consumers to improve the ease and transparency of discussing sports technology evaluation.
The working group consisted of 11 members from four countries (Australia, the United States, Belgium, and Germany). From February to August 2022, the group conducted a review of standards, research, and consensus statements on technology assessment in sports as well as in adjacent industries, including digital health, psychology, software engineering, security and defense, and e-commerce, and developed a first draft of the framework (Figure 2).
Formal review of the framework was undertaken using a Delphi approach [20]. The working group contacted 110 experts in the sports technology field to review and comment on the draft framework, of whom 48 participated. This expert panel was selected to represent key stakeholder groups, including sports governing bodies and leagues, teams, practitioners, athletes, manufacturers, investors, educators, researchers, and consultants. Members of the working group also presented the draft framework at the Sports Tech Research & Innovation Summit (STRN, Ghent, Belgium, September 2022) and the Fédération Internationale de Football Association (FIFA) Research Symposium (Zurich, Switzerland, October 2022). Following the first round of expert feedback, the working group convened to revise the framework. The revised framework was then resubmitted to the expert panel for review, repeating the cycle until consensus (agreement > 75%) was reached on all aspects of the framework [21].
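For illustration, the consensus criterion can be expressed in a few lines of code. Below is a minimal sketch, assuming a simple agree/disagree response per expert per framework item; the item names and response format are hypothetical and do not reflect the study's actual survey instrument.

```python
# Minimal sketch of the Delphi consensus check (consensus = agreement > 75%).
# Item names and the agree/disagree response format are illustrative
# assumptions, not the study's actual instrument.

def agreement(responses: list[bool]) -> float:
    """Fraction of experts agreeing with a framework item."""
    return sum(responses) / len(responses)

def items_needing_revision(panel: dict[str, list[bool]],
                           threshold: float = 0.75) -> list[str]:
    """Items at or below the threshold return to the working group
    for another revision round."""
    return [item for item, votes in panel.items()
            if agreement(votes) <= threshold]

# Example: 48 experts voting on two hypothetical framework items.
panel = {
    "Pillar definition: Ethics & Security": [True] * 44 + [False] * 4,   # ~92%
    "Feature wording: Ease of setup":       [True] * 30 + [False] * 18,  # ~63%
}
print(items_needing_revision(panel))  # ['Feature wording: Ease of setup']
```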
The framework is structured in two levels:
Pillars: Five high-level groupings of similar quality features.
Features: Unique, measurable aspects of the quality of a sports technology.
The following tables describe and define the features for each pillar, along with relevant practical examples.
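To make the two-level structure concrete, here is a minimal sketch of the framework represented as a data structure. The five pillar titles come from the framework itself; the example feature is an illustrative placeholder, not one of the published 25.

```python
# Minimal sketch of the framework's two-level structure (pillars > features).
# Pillar titles are from the white paper; the example feature is a
# placeholder, not the published feature set.
from dataclasses import dataclass, field

@dataclass
class Feature:
    name: str        # unique, measurable aspect of quality
    definition: str  # what the feature assesses

@dataclass
class Pillar:
    title: str
    features: list[Feature] = field(default_factory=list)

framework = [
    Pillar("Quality Assurance & Measurement",
           [Feature("Concurrent validity",
                    "Agreement with an accepted reference measure")]),
    Pillar("Established Benefit"),
    Pillar("Ethics & Security"),
    Pillar("User Experience"),
    Pillar("Data Management"),
]
```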
Recommended Use of The Quality Framework
Framework Specifications
The framework serves to provide users with an objective, systematic tool to assist their sports technology decisions. As such, it is:
Non-prescriptive: The framework does not suggest that all features need to be assessed on a technology prior to use.
Not defining good vs. bad: A technology is not necessarily unsuitable for use if it does not reach a certain standard on some of the features.
Inclusive: Intentionally broad to address a wide range of technologies and applications.
Unweighted: No pillar or feature is by default more important than another.
Accessible: Written in accessible rather than technically precise language, thereby facilitating broad use.
Feature Evaluation
Ideally, each feature should be evaluated against some criterion to determine whether it is sufficiently achieved. However, to keep the framework broad, inclusive, and non-prescriptive, evaluation criteria ("standards") have been excluded. Realistically, acceptable performance for a given feature depends on the type of technology, the intended application, and the goals of the user (Figure 4). We encourage the user to consider the following sources of input within the context of their specific purpose (see the sketch after this list):
Relevant test standards
Practical knowledge
Relevant scientific literature
Known practical requirements
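One way to operationalize this guidance is to record, for each evaluated feature, the criterion applied, where it came from, and the context in which it holds. A minimal sketch follows; the feature name, context, criterion, and threshold are hypothetical examples, since the framework deliberately does not prescribe them.

```python
# Minimal sketch of a context-specific feature evaluation record.
# The feature, context, criterion, and threshold below are hypothetical;
# the framework itself deliberately leaves these to the user.
from dataclasses import dataclass
from enum import Enum

class Source(Enum):
    TEST_STANDARD = "relevant test standard"
    PRACTICAL_KNOWLEDGE = "practical knowledge"
    SCIENTIFIC_LITERATURE = "relevant scientific literature"
    PRACTICAL_REQUIREMENT = "known practical requirement"

@dataclass
class Evaluation:
    feature: str     # framework feature being assessed
    context: str     # technology type and intended application
    criterion: str   # acceptable performance for this context
    source: Source   # where the criterion came from
    achieved: bool   # whether the technology met the criterion

example = Evaluation(
    feature="Concurrent validity",
    context="GPS tracker for team-sport distance monitoring",
    criterion="mean distance error < 5% vs. a reference system",
    source=Source.SCIENTIFIC_LITERATURE,
    achieved=True,
)
```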
Planning has begun on future work to support the development of evaluation standards and to establish a repository of online resources.
Customizing Strategies for Implementation
We encourage users to begin with the entire framework and adapt it to their specific purpose. For example, all features of Pillar A: Quality Assurance & Measurement may be of critical importance for in-shoe pressure sensors but of low applicability to an athlete management system. Likewise, the degree of construct validity that is meaningful will likely differ between a professional football club and an Under 14 developmental squad.
Overall, users should select the pillars and features most relevant to their needs. Other pillars and features can be ignored, assigned lower weight, or organized into a "gatekeeper" model, where they are not part of the initial screening but may still block or qualify the extent of final implementation (Figure 5).
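To illustrate how such a strategy might be implemented, here is a minimal sketch combining a weighted initial screen with a gatekeeper stage. The features, weights, scores, and minimums are hypothetical; the framework itself is unweighted and non-prescriptive.

```python
# Minimal sketch of one implementation strategy: a weighted initial screen
# plus "gatekeeper" features that can block final implementation.
# All features, weights, scores, and minimums are hypothetical.

def screen(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted mean over the features selected for initial screening;
    features the user chose to ignore are simply omitted."""
    total = sum(weights.values())
    return sum(scores[f] * w for f, w in weights.items()) / total

def gatekeepers_pass(scores: dict[str, float],
                     gates: dict[str, float]) -> bool:
    """Gatekeeper features sit outside the initial screen but must each
    clear a minimum before final implementation."""
    return all(scores[f] >= minimum for f, minimum in gates.items())

scores = {  # hypothetical 0-1 ratings of one candidate technology
    "Concurrent validity": 0.9,
    "Ease of setup": 0.6,
    "Data privacy compliance": 0.8,
}
weights = {"Concurrent validity": 2.0, "Ease of setup": 1.0}
gates = {"Data privacy compliance": 0.7}

if gatekeepers_pass(scores, gates):
    print(f"Screening score: {screen(scores, weights):.2f}")  # 0.80
else:
    print("Blocked by a gatekeeper feature")
```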
Providing a Common Language
The framework will lay the foundation for a common language for organizations, manufacturers, investors, and consumers to communicate and discuss the value of sports technology. The features described and defined in this framework can support unambiguous communication of the evidence and value of a technology, for use cases ranging from a start-up company pitching to a sports league to a team looking to upgrade or purchase new technology. This communication flags which features are present and which are lacking, creating a clear, mutual understanding between all parties, including the end user, and thus supporting effective decision-making in the development and adoption of fit-for-purpose sports technology.
Next Steps
This framework represents an incremental yet important step toward improving the quality of sports technology. An accompanying scientific manuscript will shortly be submitted to a peer-reviewed journal. Furthermore, our next steps to improve and advance the framework over the coming year include:
Case studies and framework validation: To test the framework, it will be implemented for specific technology questions of selected partner organizations. These case studies will be published, and lessons learned will be used to validate and improve the framework.
Implementation strategies: Various implementation sequences such as the examples (see โCustomizing Strategies for Implementationโ) will be explored and developed in the course of research and industry partnerships.
Centering of standards: While this framework is not intended to be prescriptive, it is expected that many features will eventually accrue typical standards (e.g., minimum standards, gradings, pass/fail cut-offs) for specific technology types.
Course development and accreditation: The need for literacy in determining the quality of technology will continue to grow. Opportunities with governing bodies and tertiary institutions to offer formal education in this and related areas are being actively explored.
Informing policy and governing bodies: It is anticipated that governing bodies will use the framework to inform policy and technology adoption for specific leagues and competitions. The working group is available to partner with them in guiding this process.
Entrepreneur education: Training materials and advising services are being considered to improve the technical support provided to start-ups, entrepreneurs, and venture capitalists.
Organizational decision-making: Organizations may use the framework to improve their strategic decision-making, such as optimizing the roles of their staff.
Acknowledgements
The Working Group would like to acknowledge the valuable contributions of the Expert Panel, without whose selfless sharing of time and insight this project would not have been possible.
Cristine Agresta, Sian Allen, Duarte Araújo, Steve Barrett, Johsan Billingham, Alison Campbell, Patrick Clifton, Tanya Colonna, Todd Deacon, Austin Driggers, Nicolas Evans, Pieter Fiers, Allan Hahn, Shona Halson, Arne Jaspers, David Joyce, Billy Lister III, Tiago Malaquias, Brandon Marcello, Xavier Schelling, Wade Sinclair, Pro Stergiou, Stephan Suydam, Jen Swanson, Daniel Taylor, Nicole Townsend, Jan Van Haaren, and others who elected to remain anonymous.
The Working Group thanks the STRN Leadership and Staff Team for making this work possible. Special thanks go out to Elise Van der Stichelen, Ben Van Delm and Bruno D'Hulster for the initial setup of the networking initiative, and to Merel Vanoverbeke in particular for the final design of this white paper.
Endnotes
[1] L. Torres-Ronda and X. Schelling, "Critical Process for the Implementation of Technology in Sport Organizations," Strength Cond. J., vol. 39, no. 6, pp. 54–59, Dec. 2017, doi: 10.1519/SSC.0000000000000339.
[2] J. Windt, K. MacDonald, D. Taylor, B. D. Zumbo, B. C. Sporer, and D. T. Martin, "'To Tech or Not to Tech?' A Critical Decision-Making Framework for Implementing Technology in Sport," J. Athl. Train., vol. 55, no. 9, pp. 902–910, Sep. 2020, doi: 10.4085/1062-6050-0540.19.
[3] G. I. Ash et al., "Establishing a Global Standard for Wearable Devices in Sport and Fitness: Perspectives from the New England Chapter of the American College of Sports Medicine Members," Curr. Sports Med. Rep., vol. 19, no. 2, pp. 45–49, Feb. 2020, doi: 10.1249/JSR.0000000000000680.
[4] K. Trabelsi, A. S. BaHammam, H. Chtourou, H. Jahrami, and M. V. Vitiello, "The good, the bad, and the ugly of consumer sleep technologies use among athletes: A call for action," J. Sport Health Sci., Mar. 2023, doi: 10.1016/j.jshs.2023.02.005.
[5] D. Patel, D. Shah, and M. Shah, "The Intertwine of Brain and Body: A Quantitative Analysis on How Big Data Influences the System of Sports," Ann. Data Sci., vol. 7, pp. 1–16, Mar. 2020, doi: 10.1007/s40745-019-00239-y.
[6] B. T. Naik, M. F. Hashmi, and N. D. Bokde, "A Comprehensive Review of Computer Vision in Sports: Open Issues, Future Trends and Research Directions," Appl. Sci., vol. 12, no. 9, p. 4429, Apr. 2022, doi: 10.3390/app12094429.
[7] J. Spitz, J. Wagemans, D. Memmert, A. M. Williams, and W. F. Helsen, "Video assistant referees (VAR): The impact of technology on decision making in association football referees," J. Sports Sci., vol. 39, no. 2, pp. 147–153, Jan. 2021, doi: 10.1080/02640414.2020.1809163.
[8] FIFA, "Standards." 2023. Accessed: Apr. 27, 2023. [Online]. Available: https://www.fifa.com/technical/football-technology/standards
[9] D. Leung, "NBA teams banned from using wearables data in contract negotiations, player transactions." Sports Illustrated, Feb. 02, 2017. Accessed: Apr. 27, 2023. [Online]. Available: https://www.si.com/media/2017/02/02/nba-data-analytics-new-cba-wearable-device
[10] World Rugby, "Approved Devices." 2023. Accessed: Apr. 27, 2023. [Online]. Available: https://www.world.rugby/the-game/facilities-equipment/equipment/devices
[11] ASTM International, "Standards Products." 2023. Accessed: Apr. 27, 2023. [Online]. Available: https://www.astm.org/products-services/standards-and-publications/standards.html
[12] Consumer Technology Association, "Standards." 2021. Accessed: Apr. 27, 2023. [Online]. Available: https://shop.cta.tech/collections/standards/health-and-fitness
[13] Health software – Part 2: Health and wellness apps – Quality and reliability, ISO/TS 82304-2:2021, Jul. 2021.
[14] State of Victoria, Australia, Department of Health, "Digital health capability framework for allied health professionals." Victorian Government, Dec. 2021. Accessed: May 01, 2023. [Online]. Available: https://www.health.vic.gov.au/sites/default/files/2021-12/digital-health-capability-framework-for-allied-health-professionals.pdf
[15] The Digital Medicine Society (DiMe), "The Playbook - Digital Clinical Measures." 2023. Accessed: Apr. 28, 2023. [Online]. Available: https://playbook.dimesociety.org
[16] J. M. Mühlen et al., "Recommendations for determining the validity of consumer wearable heart rate devices: expert statement and checklist of the INTERLIVE Network," Br. J. Sports Med., vol. 55, no. 14, pp. 767–779, Jul. 2021, doi: 10.1136/bjsports-2020-103148.
[17] C. J. Ringuet-Riot, A. Hahn, and D. A. James, "A structured approach for technology innovation in sport," Sports Technol., vol. 6, no. 3, pp. 137–149, Aug. 2013, doi: 10.1080/19346182.2013.868468.
[18] S. Robertson, A. F. Burnett, and J. Cochrane, "Tests Examining Skill Outcomes in Sport: A Systematic Review of Measurement Properties and Feasibility," Sports Med., vol. 44, no. 4, pp. 501–518, Apr. 2014, doi: 10.1007/s40279-013-0131-0.
[19] G. I. Ash et al., "Establishing a Global Standard for Wearable Devices in Sport and Exercise Medicine: Perspectives from Academic and Industry Stakeholders," Sports Med., vol. 51, pp. 2237–2250, Nov. 2021, doi: 10.1007/s40279-021-01543-5.
[20] F. Hasson, S. Keeney, and H. McKenna, "Research guidelines for the Delphi survey technique," J. Adv. Nurs., vol. 32, no. 4, pp. 1008–1015, Oct. 2000, doi: 10.1046/j.1365-2648.2000.t01-1-01567.x.
[21] I. R. Diamond et al., "Defining consensus: A systematic review recommends methodologic criteria for reporting of Delphi studies," J. Clin. Epidemiol., vol. 67, no. 4, pp. 401–409, Apr. 2014, doi: 10.1016/j.jclinepi.2013.12.002.