Review

Emerging Tools to Capture Self-Reported Acute and Chronic Pain Outcome in Children and Adolescents: A Literature Review

1 School of Health Sciences, College of Health, Medicine and Wellbeing, The University of Newcastle, Ourimbah 2258, Australia
2 School of Biomedical Sciences and Pharmacy, College of Health, Medicine and Wellbeing, The University of Newcastle, Ourimbah 2258, Australia
3 School of Health Sciences, Queen Margaret University, Edinburgh EH21 6UU, UK
4 Facultat de Ciències de la Salut de Manresa, Universitat de Vic-Universitat Central de Catalunya, 08242 Manresa, Spain
5 Priority Research Centre Health Behaviour, Hunter Medical Research Institute, Newcastle 2305, Australia
* Author to whom correspondence should be addressed.
Submission received: 3 December 2021 / Revised: 13 January 2022 / Accepted: 20 January 2022 / Published: 25 January 2022
(This article belongs to the Section Translational Medicine)

Abstract: The advancement of digital health provides strategic and cost-effective opportunities for the progression of health care in children and adolescents. It is important for clinicians to be aware of the potential of emerging pain outcome measures and to employ evidence-based tools capable of reliably tracking acute and chronic pain over time. The main emerging pain outcome measures for children and adolescents were examined. Overall, seven main texts and their corresponding digital health technologies were included in this study. The main findings indicate that emerging digital health technologies can reduce recall bias and improve real-time paediatric data capture of acute and chronic symptoms. This literature review highlights new developments in pain management in children and adolescents and emphasizes the need for further research on the use of emerging technologies in pain management, including larger-scale, multicentre studies to further assess the validity and reliability of these tools across various demographics. The privacy and security of mHealth data must also be carefully evaluated when choosing health applications that can be introduced into daily clinical settings.

1. Introduction

Easily accessible digital health solutions may provide strategic and cost-effective opportunities to acquire useful clinical data, even remotely, that could support health-care management in children and adolescents. Clinicians should be supported in accessing emerging pain outcome measures and in employing evidence-based tools capable of reliably tracking acute and chronic paediatric pain over time.
Pain is a complex and multifactorial phenomenon which can negatively impact a child’s health-related quality of life [1]. Pain outcome measures are commonly used to assess the severity of symptoms in children and adolescents [2]. Traditionally, symptom progression has been recorded using the Wong-Baker scale, the Numeric Rating Scale, the Verbal Rating Scale or the Faces Pain Scale-Revised. These tools have been extensively validated and adopted in clinical settings to assess self-reported pain levels [3,4,5,6]. Symptom progression in children can often be misreported, due to the risk of under- or overestimation by parents, carers or practitioners [7,8]. Alarmingly, the likelihood of a child receiving pharmacological pain management interventions increases if their pain level is incorrectly recorded [9]. Allied Health Professionals (AHP) require effective and unbiased self-reported pain assessment tools to support clinical pain management strategies [2].
Clinicians should be made aware of the limitations of the more traditional self-reported paper pain outcome measures that are still commonly used in different paediatric hospital settings [10]. These limitations mostly relate to self-reported paper pain scales being cumbersome, complex to use and prone to practitioner-interpretation error [11,12]. In their now dated trial, Stone et al. (2003) described self-reported paper pain diaries as a valid method to regularly track symptoms at different times of the day, with self-reported compliance estimated to be very high (up to 90%) [13]. However, actual compliance with self-reported paper pain diaries was found to be much lower than originally estimated, increasing the chance of participants backfilling data [13]. This limitation increases the risk of recall bias, reduces the accuracy of pain diaries and highlights the risk of incorrect and inconsistent use of certain self-reported pain tools.
Recently, modern technologies have emerged which utilise novel self-reported pain outcome tools, such as eOuch, SUPER-KIDZ and iPadVAS. Electronic self-reported pain recording has the potential to address the existing limitations of self-reported paper pain outcome measures by allowing recordings to be completed intermittently throughout the day, given constant access to electronic devices [12]. Pain and distress outcomes in children have been extensively assessed for methodological quality [14,15]; however, further research is required in this paediatric field to facilitate the introduction of new smart technologies into daily clinical practice. This could potentially increase compliance and accuracy in recording self-reported pain levels in children and adolescents. This literature review discusses the advantages and limitations of emerging self-reported pain outcome tools and compares them to traditional pain outcome measures.

2. Materials and Methods

A search of Medline, PubMed, PEDro and PsycINFO was conducted, covering January 1990 to October 2021, to review the current literature and collate articles, which were then assessed against the inclusion/exclusion criteria and for quality of evidence.
The search was conducted using the following keywords: Pain outcome measure, pain assessment, pain measurement, children, adolescent, electronic, smartphone, smart device, smart-technology, validation, feasibility (Table 1). Inclusion/exclusion criteria for this literature review are presented in Table 2.
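As an illustration only (this is not the databases’ native search syntax, and the snippet below is a hypothetical sketch rather than part of the review’s methods), the keyword groups in Table 1 combine into a single Boolean query: rows 4, 7, 12 and 15 form OR-groups, and row 16 joins those groups with AND. A minimal Python sketch of that combination:

# Hypothetical sketch: assemble the Boolean query implied by Table 1.
# Rows 1-3, 5-6, 8-11 and 13-14 are keyword groups; rows 4, 7, 12 and 15 OR them; row 16 ANDs the groups.
pain_terms = ["Pain outcome measure", "Pain assess*", "Pain measurem*"]
population_terms = ["Child*", "Adolescent"]
technology_terms = ["Electronic*", "Smartphone", "Smart device", "Smart-technolog*"]
evaluation_terms = ["Validation", "Feasibility"]

def or_group(terms):
    # Quote multi-word terms and join the group with OR.
    quoted = ['"' + t + '"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

query = " AND ".join(or_group(group) for group in
                     [pain_terms, population_terms, technology_terms, evaluation_terms])
print(query)
# Prints: ("Pain outcome measure" OR "Pain assess*" OR "Pain measurem*") AND (Child* OR Adolescent)
# AND (Electronic* OR Smartphone OR "Smart device" OR Smart-technolog*) AND (Validation OR Feasibility)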
The reviewers (A.T., A.F. and A.C.) independently screened the titles and abstracts of all studies identified by the search. Full-text articles of potentially eligible studies were retrieved by A.T. and A.F. and independently screened by A.F. and A.C. Authorship and results were not masked. Disagreements regarding full-text inclusion were resolved by a third reviewer (D.S.). If disagreements could not be resolved by the third reviewer, study authors were to be contacted, although this was never required. A.T. and A.F. extracted data from the included studies using a standardized, pilot-tested form, and a second author (A.C.) checked all extracted data. If any information was absent or uncertain, study authors were contacted. Inconsistencies in data extraction were discussed between A.T., A.F. and A.C. and, if needed, resolved through arbitration by D.S.

3. Results

The search identified 1062 papers, of which 70 duplicates were removed. The remaining 992 titles and abstracts were screened, and 960 were excluded due to lack of relevance. A total of 19 articles met the criteria for inclusion in this study [16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33]. The most common reason for rejecting papers was that children (0–18 years) were included within a broader age range rather than exclusively. Figure 1 presents the PRISMA flow diagram. A summary of the findings and limitations of the emerging pain outcome measures is available in Table 3.

4. Discussion

Based on the obtained results, this discussion section outlines two distinct systems used to record acute and chronic pain progression:
(A) Electronic devices, such as PCs or laptops, where data are stored on the internal memory and then instantaneously uploaded to the cloud when a network is available, or moved to an external device.
(B) Smart devices capable of WiFi connectivity or linked to 3G or 4G mobile networks. These devices are typically cordless and highly portable, can capture the location of service, support interactive text and voice-call recording, and can take photos and videos. They include, but are not limited to, smartphones (e.g., iPhone), tablets (e.g., iPad, Samsung Galaxy) and smart watches (e.g., Apple Watch). The advantages and disadvantages of these emerging technologies tested in children and adolescents form the basis of the following discussion.

4.1. Electronic Devices Used to Capture Symptom and Pain Progression

4.1.1. Electronic Diaries

The use of electronic pain diaries has the potential to increase compliance with recording data and may reduce recall bias compared to paper pain diaries [16]. Palermo et al. (2004) investigated 60 children aged 8–16 with Juvenile Idiopathic Arthritis (JIA) or headaches [16]. The paper diary group had significantly more omissions and errors compared to the electronic pain diary group (which had none). Additionally, the electronic diaries had significantly more days completed over a week compared to paper pain diaries. Stinson et al. (2008) assessed a multidimensional electronic pain diary (eOuch) in terms of its real-time construct validity and feasibility [18]. The study was conducted in an age range of 9–18. The methodological approach accounted for the fluctuations of active arthritis symptoms that occur throughout the day in JIA; therefore, the patients were asked to record their pain level at different times (i.e., morning, lunch-time and evening). This addressed a limitation of the prior research by Palermo et al. (2004) [16], in which participants were only asked to measure symptoms at the end of the day.

4.1.2. PDA

Wood et al. (2011) concluded that recall bias was reduced through the introduction of an electronic PDA version of the faces pain scale compared to paper [22]. The study recruited 202 children, 4–12 years of age. Results indicated similar pain scores on both measures, and the mean absolute discrepancy between the two versions was not statistically different from zero.

4.1.3. Web-Based Multidimensional Pain Measure

Luca et al. (2017) assessed a web-based multidimensional pain measure (SUPER-KIDZ) for children and youth with JIA [23]. This included 71 participants aged 8–18 and 29 parent–child dyads aged 4–7. Findings from this study concluded that SUPER-KIDZ has good internal consistency, responsiveness and satisfactory test–retest reliability [23]. All of these tools present various advances compared to traditional paper-based outcomes.

4.1.4. Pain Measurement Tools Comparison

There are several limitations identified in these studies that need to be considered. Firstly, Palermo et al. (2004) did not address changes in pain throughout the day. Stinson et al. (2008) had high compliance; however, 22% of the data was missing, which may lead to a biased estimate of average weekly electronic pain ratings. Wood et al. (2011) included a cohort of chronically ill and healthy children attending day surgery within the hospital outpatient setting. It could be argued that the non-homogeneity of the observed groups could have significantly impacted the overall results, especially when dealing with chronically ill paediatric patients with complex medical and pharmacological histories. The time between administration of the paper FPS-R and the PDA FPS-R was only constrained to be less than 30 min; as a consequence, multiple results were excluded from the study because the time between measures was less than 1 min. It is possible that the lack of standardisation of the time frame between administration of the self-reported tools impacted the overall findings. Nurses were asked to extensively explain how to use the PDA version of the scale, and it is possible that there was preferential bias associated with the PDA faces pain scale. Alarmingly, owing to French hospital legislation, no ethics approval was obtained in order to conduct this paediatric research.
Finally, in comparison to traditional paper-based outcome measures, electronic pain diaries clearly showed reduced recall bias, the ability to measure pain at different times of the day, and fewer errors than paper-based pain diaries. Web-based tools were able to present an array of outcome measures within a single program, validated across a relatively large age range in children (ages 4–18).

4.2. Smart Technology to Monitor Symptom and Pain Progression

Sanchez-Rodriguez et al. (2015) assessed the validity of, and agreement between, pain intensity reports from an application (Painometer) containing four different pain intensity scales and their traditional counterparts [34]. Overall, 180 participants aged 12–19 were recruited. The results showed that the scales were highly interchangeable, with a confidence interval of 80%. Stinson et al. (2015) assessed the reliability of a multidimensional smartphone app in children and adolescents with cancer [26]. The study consisted of two components to determine validity, reliability and responsiveness. The authors concluded that real-time data collection had the potential to reduce recall bias and improve the understanding of associated symptoms in paediatric cancer patients. Sun et al. (2015) investigated the agreement between their application, called ‘Panda’, and the original paper/plastic versions of the FPS-R and CAS, and determined children’s preference between the scales [27]. Sun et al. (2015) reported a significant preference (p < 0.005) for their application compared to the traditional paper counterparts. This study involved 62 participants and provided insightful evidence on the use of smart technologies to track pain progression in a younger cohort of participants.
There are multiple limitations to these studies that should be considered. Sanchez-Rodriguez et al. (2015) included multiple outcome measures; however, they did not randomise the presentation order of the scales. Although presentation order has previously been shown not to influence the obtained rating [3], it may still influence results, especially when gathering data with children. The authors also describe the time between the two assessments (30 min) as not being adequate to prevent memory effects. Additionally, due to the sample population, participants were asked to recall their maximum pain intensity over the last three months, which may present some issues for the reliability measures. Stinson et al. (2015) described a limitation in capturing symptom progression at fixed times of the day (morning and night) rather than on an individual basis (i.e., when a participant exhibits pain). This may have led to omitting fluctuations in pain during the day and to potential bias. Furthermore, study two, which assessed responsiveness and feasibility, was unable to recruit the required number of participants. Sun et al. (2015) did not assess the reliability of their application. Results found a systematic bias towards the smartphone application, specifically with regard to the limits of agreement for clinical significance, and the cause of this was not discussed. Additionally, 27% of all FPS-R pain assessments had a value of 0, indicating that further research is needed to assess a wider range of pain intensities using this application, due to a possible end-point bias. As reliability was not determined, these values may not have been entirely accurate.
Turnbull et al. (2020) and Martinez Garcia et al. (2020) used the InteractiveClinics app and the PainAPPle app, respectively, to monitor symptoms using validated technologies [31,32]. Turnbull et al. (2020) reported moderate to good intraclass correlation coefficients (ICC) amongst healthy children and adolescents when interchanging the electronic VAS (eVAS) with the traditional paper VAS (pVAS). Martinez Garcia et al. (2020) measured pain and other post-operative outcomes at 30 min intervals, and PainAPPle appeared to be a valid instrument for the management of acute pain in paediatric patients.

4.3. Electronic and Smart Technology Compared to Traditional Outcome Measures

4.3.1. Clinical Implications and Considerations

The research evidence regarding the use of emerging outcome measures is broad and provides promising developments for clinicians to accurately measure and monitor pain and symptom progression (Table 3). Modern technologies have the potential to capture real-time data and reduce recall bias, aiming to improve the reliability of clinically tracking symptom progression in children and adolescents. Clinicians should be aware that most of the studies conducted so far were not multicentred and ranged across various paediatric demographics (hospitalised or healthy children). Thus, it is difficult to establish the transferability of these results across different paediatric clinical settings. Additionally, there is limited evidence to demonstrate the feasibility, usability and responsiveness of these applications. Whilst most studies asked children to indicate a preference between devices [26,27,34], speech-recording devices have not been examined and large-scale feasibility trials have not yet been performed on the use of smart technologies to monitor pain.
The results of this review also highlight the considerable effort invested in developing smart technologies in the field of paediatric rheumatology. These promising advancements, specifically in JIA research, demonstrate how the eOuch (Stinson et al., 2014) [21] and the JIApp (Cai et al., 2017) [28] provide validated innovations able to acquire pain assessment data that may be used to effectively inform clinical decision making and raise patients’ awareness of their condition.

4.3.2. Data Privacy

A prevalent issue involving the use of smartphone applications is the privacy and security of data collected through these platforms. Often, users of smartphone applications are unaware of how their data is used and managed [35]. Recent evidence suggests that there are concerns over the lack of privacy policies included in smartphone applications used for certain diseases such as diabetes and dementia [36,37]. Considering the sensitive nature of chronic pain conditions, data privacy and security should be taken into account when making the transition into emerging tools to monitor symptom progression [38].

4.4. Digital Health and Clinical Limitations

A consideration for the future is that smart technologies may not be easily affordable for all children and their parents/carers. Financial disadvantage has been identified as a major hurdle to using electronic devices to measure symptom progression [22]. The use of computers, laptops, PDAs and electronic pain diaries represents an extra financial burden for families and may exclude a child or adolescent from accessing the most suitable technology to track pain. Nevertheless, 67% of primary-school-aged children, and perhaps surprisingly 36% of preschool-aged children, already own a mobile screen-based device [39]. Supporting mobile applications that can be downloaded directly onto an existing smartphone may therefore represent a significant saving for families.
Excessive screen time presents a possible health issue for children and adolescents [38]. A 2020 systematic review reported a concerning correlation between children’s screen time and the development of myopia [40]. Whilst the effects of excessive screen use remain unclear [40], other authors recommend regular parental supervision to minimise these potential side effects [38].
Future research should focus on addressing the limitations outlined in this section. Detailed usability and acceptability trials should be conducted to test these smart technologies prior to their introduction to young symptomatic patients. Larger, adequately powered clinical trials and systematic reviews are urgently required to validate the different emerging digital health technologies that support acute and chronic pain management in children and adolescents. A cost-effectiveness evaluation may also be needed to determine whether digital health is capable of reducing the overall cost to the health care system. Finally, researchers and industry need to work closely together to promote the highest standards of data privacy, whilst using country-based secure servers to store sensitive data.

5. Conclusions

Emerging technologies have the potential to improve the methods by which Allied Health Professionals monitor symptom progression amongst children and adolescents, and to improve the responsiveness of their clinical management. Growing evidence indicates that recently developed mHealth apps designed to record pain intensity are highly interchangeable with their traditional paper versions. More research is required to further investigate new, reliable tools capable of recording pain. In particular, larger, multicentre randomised controlled trials are needed to consolidate the use of smart technologies in pain management for children and adolescents. The feasibility and responsiveness of these mHealth tools need to be carefully studied, with particular focus on the possible financial impact and savings for the health care system. Finally, it is of paramount importance that the privacy and security of mHealth data are carefully considered when choosing health applications to introduce into daily clinical settings.

Author Contributions

Conceptualization, A.T., D.S. (Dean Sculley), D.S. (Derek Santos), M.M., L.C., X.G., A.F. and A.C.; methodology, A.T., D.S. (Dean Sculley), D.S. (Derek Santos), X.G., A.F. and A.C.; formal analysis, A.T., D.S. (Dean Sculley), D.S. (Derek Santos), M.M., L.C., A.F. and A.C.; writing—original draft preparation, A.T., D.S. (Dean Sculley), D.S. (Derek Santos), M.M., L.C., A.F. and A.C.; writing—review and editing, A.T., D.S. (Dean Sculley), D.S. (Derek Santos), M.M., L.C., X.G., A.F. and A.C.; supervision, D.S. (Dean Sculley), D.S. (Derek Santos), X.G. and A.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author; they are not publicly available due to ethical restrictions.

Acknowledgments

Our team would like to acknowledge the Librarians at the University of Newcastle for their support, availability and expertise.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rabbitts, J.A.; Holley, A.L.; Groenewald, C.B.; Palermo, T.M. Association between widespread pain scores and functional impairment and health-related quality of life in clinical samples of children. J. Pain 2016, 17, 678–684. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Hersh, A.O.; Salimian, P.K.; Weitzman, E.R. Using patient-reported outcome measures to capture the patient’s voice in research and care of juvenile idiopathic arthritis. Rheum. Dis. Clin. 2016, 42, 333–346. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Miró, J.; Castarlenas, E.; Huguet, A. Evidence for the use of a numerical rating scale to assess the intensity of pediatric pain. Eur. J. Pain 2009, 13, 1089–1095. [Google Scholar] [CrossRef] [PubMed]
  4. Tsze, D.S.; von Baeyer, C.L.; Bulloch, B.; Dayan, P.S. Validation of self-report pain scales in children. Pediatrics 2013, 132, e971–e979. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Kashikar-Zuck, S.; Carle, A.; Barnett, K.; Goldschneider, K.R.; Sherry, D.D.; Mara, C.A.; Cunningham, N.; Farrell, J.; Tress, J.; DeWitt, E.M. Longitudinal evaluation of patient-reported outcomes measurement information systems measures in pediatric chronic pain. Pain 2016, 157, 339–347. [Google Scholar] [CrossRef] [Green Version]
  6. Garra, G.; Singer, A.J.; Taira, B.R.; Chohan, J.; Cardoz, H.; Chisena, E.; Thode, H.C., Jr. Validation of the Wong-Baker FACES pain rating scale in pediatric emergency department patients. Acad. Emerg. Med. 2010, 17, 50–54. [Google Scholar] [CrossRef]
  7. Cohen, L.L.; Vowles, K.E.; Eccleston, C. Adolescent chronic pain-related functioning: Concordance and discordance of mother-proxy and self-report ratings. Eur. J. Pain 2010, 14, 882–886. [Google Scholar] [CrossRef]
  8. Kamper, S.J.; Dissing, K.B.; Hestbaek, L. Whose pain is it anyway? Comparability of pain reports from children and their parents. Chiropr. Man. Ther. 2016, 24, 24. [Google Scholar] [CrossRef] [Green Version]
  9. Drendel, A.L.; Brousseau, D.C.; Gorelick, M.H. Pain assessment for pediatric patients in the emergency department. Pediatrics 2006, 117, 1511–1518. [Google Scholar] [CrossRef]
  10. Zisk-Rony, R.Y.; Lev, J.; Haviv, H. Nurses’ report of in-hospital pediatric pain assessment: Examining challenges and perspectives. Pain Manag. Nurs. 2015, 16, 112–120. [Google Scholar] [CrossRef]
  11. Quinn, B.L.; Sheldon, L.K.; Cooley, M.E. Pediatric pain assessment by drawn faces scales: A review. Pain Manag. Nurs. 2014, 15, 909–918. [Google Scholar] [CrossRef] [PubMed]
  12. Bird, M.-L.; Callisaya, M.L.; Cannell, J.; Gibbons, T.; Smith, S.T.; Ahuja, K.D. Accuracy, validity, and reliability of an electronic visual analog scale for pain on a touch screen tablet in healthy older adults: A clinical trial. Interact. J. Med. Res. 2016, 5, e4910. [Google Scholar] [CrossRef]
  13. Stone, A.A.; Shiffman, S.; Schwartz, J.E.; Broderick, J.E.; Hufford, M.R. Patient compliance with paper and electronic diaries. Control. Clin. Trials 2003, 24, 182–199. [Google Scholar] [CrossRef]
  14. Oliveira, N.; Gaspardo, C.; Linhares, M. Pain and distress outcomes in infants and children: A systematic review. Braz. J. Med. Biol. Res. 2017, 50. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Manworren, R.C.; Stinson, J. (Eds.) Pediatric Pain Measurement, Assessment, and Evaluation. In Seminars in Pediatric Neurology; Elsevier: Amsterdam, The Netherlands, 2016. [Google Scholar]
  16. Palermo, T.M.; Valenzuela, D.; Stork, P.P. A randomized trial of electronic versus paper pain diaries in children: Impact on compliance, accuracy, and acceptability. Pain 2004, 107, 213–219. [Google Scholar] [CrossRef] [PubMed]
  17. Fanciullo, G.J.; Cravero, J.P.; Mudge, B.O.; McHugo, G.J.; Baird, J.C. Development of a new computer method to assess children’s pain. Pain Med. 2007, 8 (Suppl. S3), S121–S128. [Google Scholar] [CrossRef] [PubMed]
  18. Stinson, J.N.; Stevens, B.J.; Feldman, B.M.; Streiner, D.; McGrath, P.J.; Dupuis, A.; Gill, N.; Petroz, G.C. Construct validity of a multidimensional electronic pain diary for adolescents with arthritis. Pain 2008, 136, 281–292. [Google Scholar] [CrossRef]
  19. Stinson, J.N.; Petroz, G.C.; Stevens, B.J.; Feldman, B.M.; Streiner, D.; McGrath, P.J.; Gill, N. Working out the kinks: Testing the feasibility of an electronic pain diary for adolescents with arthritis. Pain Res. Manag. 2008, 13, 375–382. [Google Scholar] [CrossRef] [Green Version]
  20. Cravero, J.P.; Fanciullo, G.J.; McHugo, G.J.; Baird, J.C. The validity of the Computer Face Scale for measuring pediatric pain and mood. Pediatric Anesth. 2013, 23, 156–161. [Google Scholar] [CrossRef]
  21. Stinson, J.N.; Jibb, L.A.; Lalloo, C.; Feldman, B.M.; McGrath, P.J.; Petroz, G.C.; Streiner, D.; Dupuis, A.; Gill, N.; Stevens, B. Comparison of average weekly pain using recalled paper and momentary assessment electronic diary reports in children with arthritis. Clin. J. Pain 2014, 30, 1044–1050. [Google Scholar] [CrossRef] [Green Version]
  22. Wood, C.; von Baeyer, C.L.; Falinower, S.; Moyse, D.; Annequin, D.; Legout, V. Electronic and paper versions of a faces pain intensity scale: Concordance and preference in hospitalized children. BMC Pediatr. 2011, 11, 87. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  23. Luca, N.J.; Stinson, J.N.; Feldman, B.M.; Benseler, S.M.; Beaton, D.; Campillo, S.; LeBlanc, C.; van Wyk, M.; Bayoumi, A.M. Validation of the standardized universal pain evaluations for rheumatology providers for children and youth (SUPER-KIDZ). J. Orthop. Sports Phys. Ther. 2017, 47, 731–740. [Google Scholar] [CrossRef] [PubMed]
  24. Jordan, A.; Begen, F.M.; Austin, L.; Edwards, R.T.; Connell, H. A usability and feasibility study of a computerized version of the Bath Adolescent Pain Questionnaire: The BAPQ-C. BMC Pediatr. 2020, 20, 6. [Google Scholar] [CrossRef] [PubMed]
  25. Stinson, J.N.; Jibb, L.; Nguyen, C.; Nathan, P.C.; Maloney, A.M.; Dupuis, L.; Gerstle, J.T.; Alman, B.; Hopyan, S.; Strahlendorf, C.; et al. Development and testing of a multidimensional iPhone pain assessment application for adolescents with cancer. J. Med. Internet Res. 2013, 15, e51. [Google Scholar] [CrossRef] [PubMed]
  26. Stinson, J.N.; Jibb, L.A.; Nguyen, C.; Nathan, P.C.; Maloney, A.M.; Dupuis, L.L.; Gerstle, J.T.; Hopyan, S.; Alman, B.A.; Strahlendorf, C.; et al. Construct validity and reliability of a real-time multidimensional smartphone app to assess pain in children and adolescents with cancer. Pain 2015, 156, 2607–2615. [Google Scholar] [CrossRef]
  27. Sun, T.; West, N.; Ansermino, J.M.; Montgomery, C.J.; Myers, D.; Dunsmuir, D.; Lauder, G.; Von Baeyer, C.L. A smartphone version of the Faces Pain Scale-Revised and the Color Analog Scale for postoperative pain assessment in children. Pediatric Anesth. 2015, 25, 1264–1273. [Google Scholar] [CrossRef]
  28. Cai, R.A.; Beste, D.; Chaplin, H.; Varakliotis, S.; Suffield, L.; Josephs, F.; Sen, D.; Wedderburn, L.R.; Ioannou, Y.; Hailes, S.; et al. Developing and evaluating JIApp: Acceptability and usability of a smartphone app system to improve self-management in young people with juvenile idiopathic arthritis. JMIR Mhealth Uhealth 2017, 5, e7229. [Google Scholar] [CrossRef] [Green Version]
  29. Jibb, L.A.; Stevens, B.J.; Nathan, P.C.; Seto, E.; Cafazzo, J.A.; Johnston, D.L.; Hum, V.; Stinson, J.N. Implementation and preliminary effectiveness of a real-time pain management smartphone app for adolescents with cancer: A multicenter pilot clinical study. Pediatric Blood Cancer 2017, 64, e26554. [Google Scholar] [CrossRef]
  30. Birnie, K.A.; Campbell, F.; Nguyen, C.; Lalloo, C.; Tsimicalis, A.; Matava, C.; Cafazzo, J.; Stinson, J. iCanCope PostOp: User-centered design of a smartphone-based app for self-management of postoperative pain in children and adolescents. JMIR Form. Res. 2019, 3, e12028. [Google Scholar] [CrossRef]
  31. Turnbull, A.; Sculley, D.; Escalona-Marfil, C.; Riu-Gispert, L.; Ruiz-Moreno, J.; Gironès, X.; Coda, A. Comparison of a Mobile Health Electronic Visual Analog Scale App With a Traditional Paper Visual Analog Scale for Pain Evaluation: Cross-Sectional Observational Study. J. Med. Internet Res. 2020, 22, e18284. [Google Scholar] [CrossRef]
  32. Martinez Garcia, E.; Catalan Escudero, P.; Mateos Arroyo, J.; Ramos Luengo, A.; Sanchez Alonso, F.; Reinoso Barbero, F. PainAPPle: Validation and evaluation of an electronic application for the management of acute pain in pediatric patients. Rev. Española Anestesiol. Reanim. 2020, 67, 139–146. [Google Scholar] [CrossRef]
  33. Lalloo, C.; Harris, L.R.; Hundert, A.S.; Berard, R.; Cafazzo, J.; Connelly, M.; Feldman, B.M.; Houghton, K.; Huber, A.; Laxer, R.M.; et al. The iCanCope pain self-management application for adolescents with juvenile idiopathic arthritis: A pilot randomized controlled trial. Rheumatology 2021, 60, 196–206. [Google Scholar] [CrossRef] [PubMed]
  34. Sanchez-Rodriguez, E.; de la Vega, R.; Castarlenas, E.; Roset, R.; Miro, J. AN APP for the Assessment of Pain Intensity: Validity Properties and Agreement of Pain Reports When Used with Young People. Pain Med. 2015, 16, 1982–1992. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  35. Hussain, M.; Al-Haiqi, A.; Zaidan, A.; Zaidan, B.; Kiah, M.; Iqbal, S.; Abdulnabi, M. A security framework for mHealth apps on Android platform. Comput. Secur. 2018, 75, 191–217. [Google Scholar] [CrossRef]
  36. Blenner, S.R.; Köllmer, M.; Rouse, A.J.; Daneshvar, N.; Williams, C.; Andrews, L.B. Privacy policies of android diabetes apps and sharing of health information. Jama 2016, 315, 1051–1052. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  37. Rosenfeld, L.; Torous, J.; Vahia, I.V. Data security and privacy in apps for dementia: An analysis of existing privacy policies. Am. J. Geriatr. Psychiatry 2017, 25, 873–877. [Google Scholar] [CrossRef]
  38. Thornton, S. Mobile phone use in children and young people: A public health concern? Br. J. Sch. Nurs. 2018, 13, 94–97. [Google Scholar] [CrossRef]
  39. Chiu, C.-T.; Chang, Y.-H.; Chen, C.-C.; Ko, M.-C.; Li, C.-Y. Mobile phone use and health symptoms in children. J. Formos. Med. Assoc. 2015, 114, 598–604. [Google Scholar] [CrossRef]
  40. Lanca, C.; Saw, S.M. The association between digital screen time and myopia: A systematic review. Ophthalmic Physiol. Opt. 2020, 40, 216–229. [Google Scholar] [CrossRef] [Green Version]
Figure 1. PRISMA flow chart.
Table 1. Search Strategy.
1. Pain outcome measure
2. Pain assess *
3. Pain measurem *
4. 1 or 2 or 3
5. Child *
6. Adolescent.tw
7. 5 or 6
8. Electronic *
9. Smartphone
10. Smart device
11. Smart-technolog *
12. 8 or 9 or 10 or 11
13. Validation
14. Feasibility
15. 13 or 14
16. 4 and 7 and 12 and 15
Table 2. Inclusion and exclusion criteria.
Inclusion criteria:
  • English language
  • Ages 0–18
  • Peer-reviewed articles
  • Conference abstracts are eligible only if the full-text report can be identified
Exclusion criteria:
  • Articles that do not explicitly state the exclusion of participants that have: neurological disorders such as sensory processing disorders, or intellectual disabilities that impede the participants’ perception of pain; or a physical disability that impairs the ability to carry out self-reported pain scales, i.e., visual impairment
  • Editorials
  • Study protocols for future or ongoing evaluations to capture self-reported acute and chronic pain outcomes in children and adolescents
Table 3. Summary of findings and limitations of emerging pain outcome measures included (columns: Name; Study; Methodology and sample; Findings; Limitations).

Electronic pain outcome measures

Electronic pain diary — Palermo et al., 2004. Methodology and sample: Randomized clinical trial; n = 60, age range 8–16, mean = 12.3; headaches or Juvenile Idiopathic Arthritis. Findings: Children with e-diaries completed more days compared to p-diaries; p-diaries had more errors. Limitations: Only included pain tracking at the end of the day, not multiple times of the day.

Computer Face Scale — Fanciullo et al., 2007. Methodology and sample: Cohort observational study evaluating the feasibility of the computerised version of the Wong-Baker Face scale; 54 in-patient children with a mean age of 10.7, and a second convenience sample of 30 children with a mean age of 7.2; each sample used to test two objectives. Findings: The majority (76%) of participants preferred the computer version over paper; children were able to show varying levels of emotion when expressing pain levels. Limitations: Data collected at only one time point and the timeframe was not explicit; no control group; difficult to quantitatively compare with other scales that use 0–10 numerical values.

eOuch — Stinson et al., 2008. Methodology and sample: Descriptive study design; 13 adolescents with Juvenile Idiopathic Arthritis (JIA). Findings: Participants were required to complete the eOuch 3 times per day; most participants reported the e-diary was easy to use; phase 1 of the study had 73% compliance and phase 2 had 70%. Limitations: Small sample size, drawn from one tertiary paediatric centre; the same patients were used in both the usability and acceptability studies.

eOuch — Stinson et al., 2008. Methodology and sample: Descriptive study design; study 1: n = 76, age range 9–17, mean = 13.4; study 2: n = 36, age range 8–17, mean = 12.6; rheumatology clinics. Findings: Data were collected by the children; evidence of construct validity and feasibility of the eOuch pain diary in adolescents with JIA; provided more information (3 times a day) compared to Palermo et al. Limitations: 22% of data was missing, potentially leading to a biased estimate of average weekly electronic pain ratings.

Personal digital assistant (PDA) FPS — Wood et al., 2011. Methodology and sample: Observational, multicenter, randomized, cross-over, controlled, open trial; n = 202, age range 4–12. Findings: Data were collected by hospitalised children; mean pain scores were 3.1 ± 2.3 and 3.2 ± 2.3 for paper and PDA scores, respectively; the mean absolute discrepancy between the two versions was not statistically significantly different from zero. Limitations: Participants from multiple wards (chronic disease and also day surgery); the time between the two assessments was not standardised (only required to be less than 30 min).

Computer Face Scale — Cravero et al., 2013. Methodology and sample: Validation study of the Computer Face Scale; 40 children aged 5–13 who underwent a tonsillectomy at the Children’s Hospital at Dartmouth-Hitchcock Medical Center; participants used a Dell mobile PDA to display the face scale, with arrows used to cycle between each expression. Findings: When comparing the CFS to the verbal rating scale and the Wong-Baker scale, the authors concluded good validity: “The correlation between the pain ratings from the Computer Face Scale and the Wong-Baker Faces Scale after surgery was 0.83”. Limitations: Sample population was limited to those undergoing surgery and therefore results cannot be generalised to other populations; no mention of the CFS being available on mobile phones and/or app stores across multiple devices.

eOuch — Stinson et al., 2014. Methodology and sample: Construct validity study in children with JIA, comparing momentary and recalled pain measurements with the eOuch; 70 adolescent JIA participants. Findings: Between-person momentary and recalled pain measurements showed a moderate intraclass correlation coefficient (ICC); within-person measurements displayed weak ICC. Limitations: Sample sourced from one clinic; the study did not include a practice session; weekly momentary analysis may have been influenced by the 22% missing data reported.

SUPER-KIDZ — Luca et al., 2017. Methodology and sample: Clinimetric study using prospectively collected repeated measures; study 1: n = 71, age range 8–18; study 2: n = 29 (parent–child dyads), age range 4–7. Findings: For study 1, data were collected by the children; good internal consistency, responsiveness and satisfactory test–retest reliability. Limitations: Small sample size; reliability study compromised due to unstable pain levels.

BAPQ-C — Jordan et al., 2020. Methodology and sample: Qualitative study exploring the feasibility of the electronic version of the Bath Adolescent Pain Questionnaire; fourteen adolescents with chronic pain (13 females; 13–16 years) recruited from a hospital-based residential pain management programme. Findings: Authors reported high acceptability of the BAPQ-C; 93% of participants reported that the BAPQ-C was both ‘quicker’ and ‘easier’ to complete than the BAPQ; only one participant preferred the paper version. Limitations: Small and specific sample population; further validation required in patients outside hospital residential care.

Smart technology pain outcome measures

Pain Squad — Stinson et al., 2013. Methodology and sample: Usability, feasibility, compliance and satisfaction study; qualitative interviews followed by compliance and satisfaction data; 15 adolescents with cancer, average age 13. Findings: Participants provided feedback on the app during interviews and the authors made adjustments accordingly; “88% of questions were rated as ‘important’ or ‘very important’ by the majority (>50%) of adolescents”; compliance was high, with a mean of 81%; authors also reported high satisfaction among the sample. Limitations: No direct examination of the high compliance rates; app only accessible on an iPhone or other Apple devices; information from interviewed participants was not verified with a follow-up interview.

Painometer — Sanchez-Rodriguez et al., 2015. Methodology and sample: Cross-sectional observational study; n = 180, age range 12–19, mean = 14.8; four traditional outcome measures updated to smart devices. Findings: Data were collected by the children; an 80% confidence interval determined that the scales were interchangeable. Limitations: Participants were asked to remember maximum pain over the last 3 months; presentation order of the scales was not randomised.

Pain Squad — Stinson et al., 2015. Methodology and sample: Prospective descriptive study design with repeated measures to test construct validity, reliability and feasibility; study 1: n = 92, age range 8–18, mean = 13.1; study 2: n = 14, age range 9–18, mean = 14.8; paediatric cancer patients. Findings: Data were collected by the children; the multidimensional app was found to be valid, reliable and feasible within a paediatric cancer setting. Limitations: Small sample size in study 2 (not enough participants recruited); measures collected morning and evening may have missed other fluctuations, leading to bias.

Smartphone FPS-R and CAS — Sun et al., 2015. Methodology and sample: Observational, randomized, cross-over-controlled, open trial; study 1: n = 62, age range 4–12, mean = 7.5; study 2: n = 66, age range 5–18, mean = 13; children scheduled for surgery with anticipated post-operative pain. Findings: Data were collected by the children; Panda correlated strongly with original scores; mean pain scores were higher in the application compared to the original tool (systematic bias, within clinical significance, 80%). Limitations: Not multidimensional; 27% of scores were 0 on the FPS-R; did not assess the reliability of the application; one sample location.

JIApp — Cai et al., 2017. Methodology and sample: Design, development and evaluation of the acceptability and usability of the JIApp; three-phase study in children with JIA; participants ranged from ages 10–24 across the phased study; three themes: (1) remote monitoring, (2) treatment adherence, (3) education and support. Findings: Patients with JIA were able to report and monitor several parameters associated with their disease, including but not limited to pain, joint symptoms, psychological well-being and activity limitation; young JIA patients reported a mean acceptability rating of 4.29 and expressed multiple benefits of the app. Limitations: Limited sample size; will require further validation in a larger clinical trial.

Pain Squad+ — Jibb et al., 2017. Methodology and sample: Prospective cohort design of adolescents aged 12–18 currently undergoing cancer treatment; 40 participants recruited, with a mean age of 14.2. Findings: Overall adherence to Pain Squad+ was 77.2%; the acceptability e-scale showed a minimum average of 3 on all items assessed, indicating satisfactory acceptability. Limitations: Single-group design; no control group; pilot study, therefore requires further investigation with a larger sample size.

iCanCope PostOp app — Birnie et al., 2019. Methodology and sample: User-centered design study with two principal phases: (1) semi-structured interviews, (2) two-stage Delphi survey; 19 children with a mean age of 15.26 who underwent surgery within a 7-day period were recruited; iCanCope is a smartphone-based app for children and adolescents’ self-management of acute postoperative pain. Findings: All participants reported the three proposed features of the app as important (pain tracking, pain advice and goal setting); multiple features were proposed by participants, parents and health care workers, including but not limited to pain advice within the app, goal setting, direct communication with health care providers and medication tracking. Limitations: Convenience sample; potentially limited by a lack of comprehensiveness of all types of surgeries for potential end users.

InteractiveClinics app — Turnbull et al., 2020. Methodology and sample: Cross-sectional observational study; 47 children and adolescents (mean age 13.9 years, SD 2.89 years; range 10–18 years). Findings: Authors concluded moderate to good ICC when interchanging the eVAS and pVAS. Limitations: Convenience sample; possibly lower reliability in the child/adolescent sample due to differences in the size of the VAS measuring line.

PainAPPle® — Martínez García et al., 2020. Methodology and sample: Descriptive cohort study of 44 paediatric patients post-surgery; mean age = 11.3. Findings: Data were collected by children after they recovered from anaesthesia post-surgery; PainAPPle was used at 30 min intervals to measure pain and other post-operative outcomes; statistically significant correlations were produced when comparing the electronic and paper versions of PainAPPle. Limitations: Specific population sample; not generalisable; requires further testing for validation.

iCanCope app — Lalloo et al., 2021. Methodology and sample: Feasibility and pilot RCT of the iCanCope app in adolescents with JIA; 60 adolescents with JIA recruited and randomised to a control or trial intervention; mean age = 15.0; the trial condition was the iCanCope app plus self-management features, while the control group received only the iCanCope app (no self-management). Findings: Both study conditions were deployed with high success; pain intensity improved in both groups, by 1.73 (intervention) and 1.09 (control); no significant changes in quality of life or pain-related activity limitations; overall, the app was adhered to well and was acceptable to most adolescent JIA patients with pain. Limitations: Requires a third arm (usual care only, i.e., no app) to assess the effectiveness of iCanCope on outcomes in children with JIA.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Turnbull, A.; Sculley, D.; Santos, D.; Maarj, M.; Chapple, L.; Gironès, X.; Fellas, A.; Coda, A. Emerging Tools to Capture Self-Reported Acute and Chronic Pain Outcome in Children and Adolescents: A Literature Review. Med. Sci. 2022, 10, 6. https://0-doi-org.brum.beds.ac.uk/10.3390/medsci10010006

AMA Style

Turnbull A, Sculley D, Santos D, Maarj M, Chapple L, Gironès X, Fellas A, Coda A. Emerging Tools to Capture Self-Reported Acute and Chronic Pain Outcome in Children and Adolescents: A Literature Review. Medical Sciences. 2022; 10(1):6. https://0-doi-org.brum.beds.ac.uk/10.3390/medsci10010006

Chicago/Turabian Style

Turnbull, Alexandra, Dean Sculley, Derek Santos, Mohammed Maarj, Lachlan Chapple, Xavier Gironès, Antoni Fellas, and Andrea Coda. 2022. "Emerging Tools to Capture Self-Reported Acute and Chronic Pain Outcome in Children and Adolescents: A Literature Review" Medical Sciences 10, no. 1: 6. https://0-doi-org.brum.beds.ac.uk/10.3390/medsci10010006

