Random variables, it would seem, arise in two contexts. According to quantum mechanics, the collapse of the wave function produces truly random outcomes; recall that the violation of Bell's inequality rules out local hidden-variable theories. Let us say, for convenience, that quantum systems are first-kind-random.
But what is the nature of macroscopic random events? Suppose we roll a die. The angle through which the die rotates, the slightest perturbation from a speck of dust, the roughness of the surface it lands on: all of these are random variables, and they are mixed so thoroughly that the face the die shows follows a uniform distribution. Still, it is possible to regard the system as completely deterministic. Classical mechanics describes its evolution in phase space, and chaos theory tells us that deterministic systems can produce seemingly random results, as the sketch below suggests. Let us call such events second-kind-random.
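Here is a minimal numerical sketch of that claim (the map, the initial conditions, and the step counts are my own illustrative choices, not part of any formal argument). The logistic map is a one-line deterministic rule, yet a perturbation of size 10^-10, the mathematical analogue of a speck of dust, grows until the two trajectories have nothing to do with each other.

```python
# A sketch of second-kind-randomness: the logistic map x -> 4x(1-x) is fully
# deterministic, yet two trajectories started a hair apart diverge quickly.
# The constants (r = 4, 1e-10, 60 steps) are illustrative choices.

def logistic(x: float) -> float:
    """One step of the logistic map with parameter r = 4 (the chaotic regime)."""
    return 4.0 * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-10   # two initial conditions differing by 1e-10
for step in range(60):
    x, y = logistic(x), logistic(y)
    if step % 10 == 9:
        print(f"step {step + 1:2d}: |x - y| = {abs(x - y):.3e}")
# After a few dozen iterations the separation is of order 1: the tiny
# perturbation decides the outcome, even though nothing here is random.
```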
It is then inevitable to conclude that, strangely, the first-kind-randomness somehow generates the second-kind-randomness. The reason might be the central limit theorem: the aggregate of many small independent perturbations is approximately normally distributed, while the (strong) law of large numbers guarantees that their average converges almost surely to a fixed expectation. The short simulation below illustrates both effects.
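To make both statements concrete, here is a small simulation, assuming only NumPy; the sample sizes are arbitrary. Many independent uniform perturbations are added together: their running mean settles toward zero (law of large numbers), and their standardized sum behaves like a standard normal variable (central limit theorem).

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 10_000, 2_000
samples = rng.uniform(-0.5, 0.5, size=(trials, n))

# Strong law of large numbers: the sample mean of one long run tends to 0.
running_mean = samples[0].cumsum() / np.arange(1, n + 1)
print("sample mean after", n, "draws:", running_mean[-1])

# Central limit theorem: standardized sums over many independent runs
# are distributed close to the standard normal.
sigma = np.sqrt(1.0 / 12.0)               # std of a single uniform draw
z = samples.sum(axis=1) / (sigma * np.sqrt(n))
print("mean of standardized sums:", z.mean().round(3))
print("std  of standardized sums:", z.std().round(3))
print("fraction within one sigma:", np.mean(np.abs(z) < 1).round(3), "(normal: ~0.683)")
```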
On the other hand, the second-kind-randomness in turn simulates the first-kind-randomness. As an example, the prime numbers are completely deterministic, yet statistics built from them behave like random variables: by the Erdős–Kac theorem, the number of distinct prime factors of an integer, suitably centered and scaled, converges in distribution to the standard normal (a small numerical check closes this note). In fact, this is what I think mathematics does. Probability theory, classical or measure-theoretic, does not rule out a completely static interpretation. The event space is just a static collection of outcomes. The expectation is just an integral, be it Lebesgue, Riemann, or some other definition. The theory of probability is no doubt rigorous in its logic, but the very core concept that lies within it, probability itself, lives only in our hearts. We have defined everything in a very sound manner except the most crucial thing: what randomness is. Maybe we shall never know. Maybe it is something we see every day, but cannot be said.
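As a numerical footnote to the prime example above, the following sketch (plain standard-library Python; the bound N is an arbitrary choice) sieves the count of distinct prime factors omega(n) and standardizes it as in the Erdős–Kac theorem. The convergence is notoriously slow, so the figures are only roughly normal, but the deterministic primes do imitate a bell curve.

```python
# Erdős–Kac statistic: (omega(n) - log log n) / sqrt(log log n) should look
# approximately standard normal for large n. Everything below is deterministic.
import math

N = 200_000
omega = [0] * (N + 1)
for p in range(2, N + 1):
    if omega[p] == 0:                 # p is prime: no smaller prime divided it
        for multiple in range(p, N + 1, p):
            omega[multiple] += 1      # count p as a distinct prime factor

z = []
for n in range(3, N + 1):
    ll = math.log(math.log(n))
    if ll > 0:
        z.append((omega[n] - ll) / math.sqrt(ll))

mean = sum(z) / len(z)
std = math.sqrt(sum((v - mean) ** 2 for v in z) / len(z))
within = sum(abs(v) < 1 for v in z) / len(z)
print(f"mean ~ {mean:.2f}, std ~ {std:.2f}, within one sigma ~ {within:.2f} (normal: ~0.68)")
# log log n grows glacially, so the fit is rough at this range,
# but the shape of the normal distribution is already visible.
```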