Purpose: To evaluate the relative importance and predictive ability of salivary immunoglobulin A (s-IgA) measures with regard to upper respiratory illness (URI) in youth athletes.
Methods: Over a 38-week period, 22 youth athletes (age = 16.8 [0.5] y) reported daily symptoms of URI and provided 15 fortnightly passive-drool saliva samples, from which s-IgA concentration and secretion rate were measured. Kernel-smoothed bootstrapping was used to generate a balanced data set with simulated data points. The random forest algorithm was used to evaluate the relative importance (RI) and predictive ability of s-IgA concentration and secretion rate with regard to URI symptoms present on the day of saliva sampling (URIday), within 2 weeks of sampling (URI2wk), and within 4 weeks of sampling (URI4wk).
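The kernel-smoothed bootstrap used to balance the data set can be sketched as below. This is a minimal illustration only, assuming a Gaussian kernel with Silverman's rule-of-thumb bandwidth; the authors' implementation and all numeric values here are hypothetical, not study data.

```python
import numpy as np

def kernel_smoothed_bootstrap(x, n_samples, rng=None):
    """Draw a kernel-smoothed bootstrap sample: resample with replacement,
    then jitter each draw with Gaussian kernel noise (Silverman bandwidth)."""
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    # Silverman's rule-of-thumb bandwidth for a Gaussian kernel
    h = 1.06 * x.std(ddof=1) * len(x) ** (-1 / 5)
    resampled = rng.choice(x, size=n_samples, replace=True)
    return resampled + rng.normal(0.0, h, size=n_samples)

# Example: oversample a minority (URI) class to match the majority class size
healthy = np.random.default_rng(0).normal(150.0, 30.0, 300)  # simulated s-IgA values
uri = np.random.default_rng(1).normal(110.0, 25.0, 30)       # simulated s-IgA values
uri_balanced = kernel_smoothed_bootstrap(uri, len(healthy), rng=2)
print(uri_balanced.shape)
```

Unlike a plain bootstrap, the added kernel noise means the simulated points are drawn from a smoothed estimate of the minority-class distribution rather than being exact duplicates of observed values.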
Results: The percentage deviation from average healthy s-IgA concentration was the most important feature for URIday (median RI 1.74, interquartile range 1.41-2.07). The average healthy s-IgA secretion rate was the most important feature for URI4wk (median RI 0.94, interquartile range 0.79-1.13). No feature was clearly more important than any other when URI symptoms were identified within 2 weeks of sampling. Median area under the curve (AUC) values were 0.68, 0.63, and 0.65 for URIday, URI2wk, and URI4wk, respectively.
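A random-forest evaluation of feature importance and discriminative ability of this general kind can be sketched with scikit-learn, as below. Note that this uses scikit-learn's impurity-based importances and a single train/test split; the study's RI metric and resampling scheme may differ, and the features and outcome here are simulated stand-ins, not study data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
# Hypothetical features: column 0 ~ % deviation from healthy s-IgA concentration,
# column 1 ~ average healthy s-IgA secretion rate (both simulated)
X = rng.normal(size=(n, 2))
# Simulated outcome: URI more likely when concentration deviates downward
y = (X[:, 0] + 0.3 * rng.normal(size=n) < -0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
importances = rf.feature_importances_            # relative importance per feature
auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])  # area under ROC curve
print(importances, round(auc, 2))
```

An AUC near 0.5 indicates chance-level discrimination and values near 1.0 indicate perfect discrimination, which is the scale against which the reported 0.63-0.68 values should be read.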
Conclusions: The RI values suggest that the percentage deviation from average healthy s-IgA concentration may be used to evaluate the short-term risk of URI, while the average healthy s-IgA secretion rate may be used to evaluate the long-term risk. However, the results show that neither s-IgA concentration nor secretion rate can be used to accurately predict URI onset within a 4-week window in youth athletes.
Keywords: adolescent; immune function; machine learning; monitoring.