There is a powerful image that keeps coming to mind lately when we think about our work as sports professionals: being behind the wheel of a car on a mountain road. The path is uncertain, full of blind curves and crests that demand our full attention because we don't know what's on the other side, how the car will behave, or whether we can maneuver quickly when faced with the unexpected. However, instead of fixing our gaze on the windshield, on what is to come, we obsess over looking in the rearview mirror. From that perspective, control is absolute. We know the road we've just left behind with great precision. The past is a clear, quantifiable, and orderly territory. But while we are engaged in this perfect analysis of what has already happened, we are unknowingly hurtling toward the next sharp curve. Will we be able to keep the car on the road?
This metaphor, so obvious on the road, seems to vanish with astonishing ease when we enter our field: the analysis of sports performance. We have internalized the backward gaze to such an extent that we have forgotten to question our own processes. In recent years, we have given data great power, to the point of basing a large part of the decisions in high-performance sports on it. We do not intend to diminish its value in the slightest, as it is very useful, but it cannot be everything. This is mainly because the data we collect, just like what we see in the car's rearview mirror, is only a static picture of the past. It describes part of a reality that no longer exists, and only to the extent that we can claim with some certainty that the data captures the essence of the reality under study.
Allow us to share an anecdote that has partly motivated this editorial. Not long ago, a high-level basketball scouting manager confessed the following to us: "Statistics in basketball are used to save you from watching games. But if you really want to know how a player or a team behaves, you have to sit down and watch the games." This seemingly obvious statement is devastating in its simplicity and largely describes the situation we find ourselves in. It is through observation of the game that we can better understand its dynamics and thus delve into the study of what comes next: the next stimulus, the next drill, the next training session, or the next game.
This drift toward detail is well illustrated by biomedical research over the last century. Let's think about the shift from studying the functioning of complete physiological systems to analyzing the behavior of a single cell in a Petri dish. This reductionist approach has been an undeniable engine of progress, allowing us to decipher fundamental mechanisms. This is not a criticism, but a recognition of a success. However, it has also created a monumental challenge: understanding how those truths, discovered in the isolation of a laboratory, translate back into the chaotic and interconnected ecosystem of the living organism. By focusing on a part, we run the risk of misinterpreting the whole. And that is exactly our crossroads.
Let's move the example to our field, to the analysis of data collected in load monitoring processes. We currently have technological solutions—positioning systems, inertial sensors, etc.—that provide us with a giant, high-definition rearview mirror, allowing us to see in great detail the curves we have already navigated. We can even zoom in almost infinitely to break down each of those curves into a thousand different pieces. We can observe centimeter by centimeter where the player was, what accelerations their trunk was subjected to, or how long a change of direction lasted. At this point, we are one step away from a dangerous trap. We have stopped seeing the curve; we have lost the perspective of the whole, of the context. We move on to analyzing the gesture with exquisite precision, treating it as an isolated event, forgetting that it is part of an adaptive system. This forces us to ask some questions: What is the real impact of that hyper-analyzed variable on the player's overall state? Will the change we propose in that small piece have the linear and predictable effect we expect on the whole? We are convinced that the analysis has value, but we must be cautious.
And that drift toward detail, that reductionism that has allowed us to understand the isolated piece, is precisely what feeds the illusion of linearity. By zooming in on a single variable—an acceleration, the distance covered at high speed, etc.—we fall into the temptation of believing that we have isolated the fundamental cause of performance or risk. We assume that if we modify that piece, the effect on the whole will be direct, predictable, and proportional.
We forget that in complex systems, the interactions between the components often matter more than the components themselves. By focusing on the curve, we not only lose sight of the road; we forget that the state of the road itself modifies how we approach each curve and shapes the driving experience. And here we have already run into the illusion of linearity. The person in charge of load management looks at the data and assumes, almost by inertia, that if a load X produced a result Y, a load X+1 will produce a predictably superior result. If we have done more today, the response is more intense; the more meters covered at high intensity in competition, the more time players will need to recover. This reasoning often pushes us toward shorter and less intense tasks. And what certainty do we have that this is more beneficial? (A note in passing: these analyses are generally performed using averages and standard deviations, which may not be the most appropriate statistics for living systems that change their response to apparently similar stimuli. We will explore this in another editorial.)
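To make this illusion tangible, consider a minimal sketch in Python. Everything in it is an assumption chosen for illustration: the inverted-U shape of the dose-response, the hypothetical true_response function, and all of the numbers. It models no real athlete or monitoring system; it simply shows how a linear rule fitted to the loads we have already seen can diverge badly once we extrapolate beyond them.

```python
import numpy as np

# Purely illustrative: a hypothetical inverted-U dose-response, in which
# adaptation grows with load up to a point and then degrades, versus the
# linear rule of thumb "load X produced result Y, so X+1 will produce more".
def true_response(load):
    # Invented curve: the benefit peaks around load = 6, then declines.
    return 10 * load * np.exp(-load / 6)

loads_seen = np.array([1.0, 2.0, 3.0, 4.0])  # loads already applied
responses_seen = true_response(loads_seen)   # responses we observed

# Fit the naive linear model on the past (the rearview mirror).
slope, intercept = np.polyfit(loads_seen, responses_seen, 1)

# Extrapolate forward and compare with the (hypothetical) real system.
for load in (5.0, 8.0, 12.0):
    linear_guess = slope * load + intercept
    actual = true_response(load)
    print(f"load {load:4.1f}: linear model predicts {linear_guess:5.1f}, "
          f"the system actually returns {actual:5.1f}")
```

The fitted line describes the past faithfully and the future poorly: it is the rearview mirror, expressed as a slope and an intercept.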
Nor is this exclusive to daily practice in the field: countless recognized scientific journals publish articles that correlate accumulated load with past injuries, but in essence this is forensic analysis. We study the black box of the plane after the crash. This forces us to ask ourselves an uncomfortable question: are we using this data to understand complexity or, in reality, to feel that we control it?
The fundamental problem with this approach is that it treats athletes as machines, and the antidote to extreme reductionism is to remember that they are not. They are complex and adaptive biological systems. To ignore this is to ignore the elephant in the room. The key to understanding them lies in concepts that should be the pillar of our practice. One of them is hormesis: the principle by which a stressor in the right dose provokes a response that strengthens the system, breaking any simple linear logic.
Let's think of two players who perform the exact same session. For one, who is well-rested and has high confidence, it is an optimal stimulus. For the other, who is carrying mental fatigue and sleeping poorly, that same load can be the push toward injury. The external load recorded by the positioning systems will be identical; the system that receives it, and therefore the result, is radically different. The data on the screen does not tell us that story.
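The same point can be sketched in a few lines of Python. Again, every element is hypothetical: the readiness score, the way it splits an external load into adaptation and strain, and the numbers themselves are invented solely to show that an identical external dose can produce opposite internal outcomes.

```python
# Purely illustrative: the same external load lands on two different
# internal states. The readiness score and the split into adaptation
# and strain are invented for this example; no real monitoring model
# is implied.
def internal_response(external_load, readiness):
    # readiness in [0, 1]: 1.0 = rested and confident, low = worn down.
    adaptation = external_load * readiness      # what the system gains
    strain = external_load * (1.5 - readiness)  # what the session costs
    return adaptation - strain                  # net effect on the athlete

session_load = 100.0  # identical external load recorded for both players

for name, readiness in [("rested player", 0.9), ("fatigued player", 0.4)]:
    net = internal_response(session_load, readiness)
    print(f"{name}: external load {session_load:.0f} -> net effect {net:+.1f}")
```

The positioning system would report the same 100 units of external load for both players; the sketch only makes explicit what the screen does not show.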
This leads us directly to the nature of the systems we work with, magnificently explored in works like Thinking in Systems by Donella Meadows or In a Flight of Starlings by Giorgio Parisi. Living systems, like an athlete or a team, have properties that defy our desire to simplify: