In probability theory all nonatomic
probability
measures look the same. That is
because any two nonatomic
separable measure algebras are isomorphic.
Quantum probability theory is different:
two normal states of B(H)
are conjugate if and only if the eigenvalue
lists of their density operators are identical.
Suppose now that one is given an increasing
sequence M_1, M_2, ...
of type I subfactors of B(H) whose
union is weak*-dense in B(H).
Common sense suggests that if one restricts
a normal state f of B(H) to
M_n and considers its eigenvalue list
L_n, then L_n
should be close to the eigenvalue list of
f when n is large.
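In symbols (the notation \Lambda is an assumption introduced here for illustration, not taken from the source), the naive expectation can be stated as follows:

```latex
% Assumed notation: \Lambda(g) denotes the eigenvalue list of the
% density operator of a normal state g, i.e. its eigenvalues in
% decreasing order, repeated according to multiplicity.
\[
  \Lambda\bigl(f\restriction_{M_n}\bigr) \longrightarrow \Lambda(f)
  \qquad \text{as } n \to \infty,
\]
% with convergence understood, say, coordinatewise in the two lists.
```

The examples discussed below show that this convergence can fail.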
We discuss some natural examples which show
that this intuition is wrong, and we
attempt to explain the phenomenon by
describing the correct asymptotic formula
when the sequence M_n is ``stable''.
Applications are not discussed here, but
are taken up in \cite{1}.