I'm reading about continuous time Markov processes, from Adventures in Stochastic Processes: The Random World of Happy Harry by Sidney Resnick. This book has some of the most amusing problems I've come across in a while; many of the problems involve a character known as "Happy Harry", who owns a greasy spoon near some university that occasionally reveals itself to be Cornell. This post does not involve one of those problems.
A discrete-time Markov chain is what people usually mean when they say "Markov chain". This is a sequence of elements X_0, X_1, X_2, ... selected from some state space S = {s_1, s_2, ...}, where
P(X_n = s_j | X_{n-1} = s_i) = p_{ij}
(the p_{ij} are called transition probabilities), and, conditional on X_{n-1}, X_n is independent of the history before X_{n-1}. Now, from any discrete-time Markov chain it's possible to construct a continuous-time Markov chain. We simply assign a rate parameter λ(j) to each state s_j. When the chain finds itself in state s_j, we wait for a time which is exponentially distributed with parameter λ(j) (and therefore mean 1/λ(j)); after this time passes we move to another state according to the transition probabilities.
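Here's a minimal sketch of this construction in Python (the function simulate_ctmc, the dictionary representation, and the two-state example chain are my own illustration, not anything from Resnick):

```python
import random

def simulate_ctmc(p, rates, state, t_max):
    """Simulate a continuous-time Markov chain built from a discrete-time
    chain with transition probabilities p (dict of dicts) and per-state
    holding rates. Returns the list of (jump time, state) pairs up to t_max."""
    t, path = 0.0, [(0.0, state)]
    while True:
        # Hold in the current state for an Exp(rate) amount of time...
        t += random.expovariate(rates[state])
        if t >= t_max:
            return path
        # ...then jump according to the embedded discrete-time chain.
        states, probs = zip(*p[state].items())
        state = random.choices(states, weights=probs)[0]
        path.append((t, state))

# Example: a two-state chain that flips back and forth.
p = {"a": {"b": 1.0}, "b": {"a": 1.0}}
rates = {"a": 1.0, "b": 3.0}  # state b is left three times as fast
print(simulate_ctmc(p, rates, "a", t_max=10.0))
```

The division of labor is clean: the embedded discrete-time chain decides where to go, and the exponential clocks decide when.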
There's a problem, though, as Resnick explains in Section 5.2 -- it can happen that the chain "blows up", i.e. it makes infinitely many transitions in finite time.
The "pure birth" process has state space {1, 2, 3, ...}; the transition probabilities are pn,n+1 = 1 with all others zero; the parameters are λ(n) = λn. That is, the process waits in state n for a time which is exponentially distributed with mean 1/(λn) and then advances to state n+1. Not surprisingly, the population grows exponentially fast -- if the population is at n currently, it grows at rate λn.
But if you let λ(n) = λn², then the process "blows up" almost surely. This isn't surprising from the point of view of a differential equation -- if the population is n, it grows at rate λn². If we ignore the stochastic elements and model this as an ordinary differential equation, the population P satisfies dP/dt = λP², and solutions to this, P(t) = P(0)/(1 - λP(0)t), have vertical asymptotes. Something similar happens in the stochastic case: the expected holding times sum to Σ 1/(λn²) = π²/(6λ) < ∞, and in fact a pure birth process blows up almost surely exactly when Σ 1/λ(n) converges.
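To see the difference numerically, here's a quick sanity check (again my own sketch, with λ = 1): the time to reach state n_max is a sum of independent exponential holding times, which we can sample directly.

```python
import random

def time_to_reach(n_max, rate):
    """Time for a pure birth process started in state 1 to reach n_max:
    the sum of Exp(rate(n)) holding times for n = 1, ..., n_max - 1."""
    return sum(random.expovariate(rate(n)) for n in range(1, n_max))

lam = 1.0
random.seed(0)
for n_max in (10, 100, 1000, 10000):
    linear = time_to_reach(n_max, lambda n: lam * n)        # Yule process
    quadratic = time_to_reach(n_max, lambda n: lam * n**2)  # blows up
    print(f"n_max = {n_max:5d}: linear {linear:7.2f}, quadratic {quadratic:5.3f}")
```

The linear times keep growing (roughly like (log n_max)/λ), while the quadratic times pile up near a finite limit -- the blow-up time, whose mean is π²/(6λ) ≈ 1.645 here.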
I then thought: when would you get a population that grows like this? Consider not a population of living beings, but a population of ideas. And let us assume that new ideas come into being when two old ideas combine. Then in some idealized world where all pairs of ideas are constantly interacting, one might expect that the rate at which new ideas appear is proportional to the number of pairs of ideas -- that is, to n(n-1)/2 when there are n ideas, which is quadratic in n, exactly the blow-up regime above. Of course, this is a silly model, because ideas don't interact all by themselves, but rather in people's brains -- and the population of people grows only exponentially. Still, this feels like it could explain why certain ideas seem to "come out of nowhere" -- at first, in some area of intellectual inquiry, there isn't much there, because new ideas require new insights; but at some point combining old insights in new ways becomes a viable way to come up with new ideas.
(Of course, checking this against reality would be quite difficult. For one thing, how do you count ideas?)
28 May 2008
1 comment:
For that matter, how do you separate sub-ideas? Or trace the proper genealogy of ideas? Or track their movement and interaction?
This post launched me into a science-fiction dream where, at some point in the future, computational linguistics and semantics and knowledge representation (and whatever other disciplines are necessary) have advanced to the point where we can monitor and track the population of ideas. I don't know what I would think of the data thus obtained, though. (And if that led to new ideas, what delightful layers of recursion would exist!)