From: Chip Eastham on
On Aug 9, 7:31 pm, Chip Eastham <hardm...(a)gmail.com> wrote:
> On Aug 9, 6:19 pm, Ray Koopman <koop...(a)sfu.ca> wrote:
>
>
>
> > On Aug 9, 2:29 pm, Chip Eastham <hardm...(a)gmail.com> wrote:
>
> > > On Aug 9, 4:54 pm, Ray Koopman <koop...(a)sfu.ca> wrote:
>
> > >> I've used the following theorem in numeric work for some time,
> > >> and it certainly seems to be true, but I've never seen a proof.
> > >> Can anyone point to one (or give it, if it's easy)?
>
> > >> "If P is a fixed (n+1) by n matrix whose rows are the coordinates of
> > >>  the vertices of an n-simplex, and W is a random (n+1)-vector (row)
> > >>  whose density is Dirichlet(1,...,1), then W*P is uniform in the
> > >>  simplex."
>
> > > Perhaps it's a simple application of barycentric
> > > coordinates, but I don't know what "density is
> > > Dirichlet(1,...,1)" means in this context.
>
> > I mean that W has a Dirichlet distribution (e.g.,
> > http://en.wikipedia.org/wiki/Dirichlet_distribution) with all
> > alpha_i = 1. W contains the coordinates of a random point that is
> > uniformly
> > distributed in the regular n-simplex whose vertices are given by the
> > rows of an identity matrix of order n+1. It seems to me that W*P is
> > then uniform in the simplex whose vertices are in P, but I need
> > something stronger than just "it seems to me...".
>
> A continuous uniform distribution is one that
> has a constant probability density function over
> the region where it is positive.
>
> The Wikipedia article, describing the Dirichlet
> distribution on the (K-1)-simplex, gives a formula
> for the pdf at (x_1,...,x_K), which is constant
> when every component of the vector alpha equals 1:
>
> (1/B(alpha)) * PRODUCT x_i^{alpha_i - 1}  [i=1,...,K]
>
> and gives the details for calculating the constant
> of normalization B(alpha).
>
> regards, chip
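
Spelling out that constant, in case it helps: with every alpha_i = 1,

  B(alpha) = PRODUCT Gamma(alpha_i) / Gamma(SUM alpha_i)
           = Gamma(1)^K / Gamma(K) = 1/(K-1)!,

so the pdf is the constant (K-1)! = n! over the whole simplex
(here K = n+1).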

So now we need to consider how the probability
density function for W*P relates to that of W.
Since the map from W to W*P is linear, the
change of variables involves a constant Jacobian
of the transformation.
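
To spell that out (with P_i denoting row i of P): on the simplex
w_{n+1} = 1 - (w_1 + ... + w_n), so

  W*P = P_{n+1} + SUM_{i=1..n} w_i*(P_i - P_{n+1}),

an affine function of (w_1,...,w_n) whose Jacobian matrix has the
constant rows P_i - P_{n+1}. Its determinant is a nonzero constant
(the vertices are affinely independent), so the constant density
of W carries over to a constant density for W*P.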

--c
From: Ray Koopman on
On Aug 9, 4:42 pm, Paul <paul_ru...(a)att.net> wrote:
> On Aug 9, 4:54 pm, Ray Koopman <koop...(a)sfu.ca> wrote:
>
>> I've used the following theorem in numeric work for some time,
>> and it certainly seems to be true, but I've never seen a proof.
>> Can anyone point to one (or give it, if it's easy)?
>>
>> "If P is a fixed (n+1) by n matrix whose rows are the coordinates of
>> the vertices of an n-simplex, and W is a random (n+1)-vector (row)
>> whose density is Dirichlet(1,...,1), then W*P is uniform in the
>> simplex."
>
> You'll want to check this, as my best days are behind me, but ... Let
> P_1, ..., P_{n+1} be the rows of P, viewed as points in \Re^n, let
> S_1, ..., S_{n+1} be the vertices of the regular simplex described in
> your second message, and let A and b be n x n and n x 1 matrices
> respectively such that the affine transformation y = A*x + b maps each
> vertex of S to the corresponding vertex of P. Then W*P is the image
> under the affine transformation of W*S, so the probability of any
> subset of P occurring is the probability of its preimage occurring
> when you weight the vertices of S using W. You already know that
> distribution is uniform, so the probability of the preimage is the
> volume of the preimage divided by the volume of S, and that ratio is
> the same as the volume of the image divided by the volume of P since
> the volume of the image is |det A| (don't care what that
> is, just that it's not zero) times the volume of the preimage (and
> ditto for P resp. S).
>
> Hope that made sense -- USENET does not make expressing math easy.
>
> /Paul

Thank you, Paul and Chip. That's what I needed.
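
For the record, a quick Monte Carlo check agrees. This is only a
rough NumPy sketch (the triangle P and the sub-triangle Q below are
arbitrary examples, not anything from the discussion):

  import numpy as np

  rng = np.random.default_rng(0)
  n = 2
  # vertices of an arbitrary 2-simplex (triangle), one per row
  P = np.array([[0.0, 0.0],
                [2.0, 0.0],
                [0.5, 1.5]])

  # W ~ Dirichlet(1,...,1): each row is a random (n+1)-vector of weights
  W = rng.dirichlet(np.ones(n + 1), size=100000)
  X = W @ P   # the points W*P, claimed to be uniform in the triangle

  # empirical mass of a sub-triangle Q vs. its share of the area
  Q = np.array([[0.0, 0.0],
                [1.0, 0.0],
                [0.25, 0.75]])   # a sub-triangle contained in P
  M = np.column_stack([Q[1] - Q[0], Q[2] - Q[0]])
  A = np.linalg.solve(M, (X - Q[0]).T).T   # coordinates of X w.r.t. Q
  inside = (A >= 0).all(axis=1) & (A.sum(axis=1) <= 1)

  def area(V):
      return abs(np.linalg.det(np.column_stack([V[1]-V[0], V[2]-V[0]]))) / 2

  print(inside.mean(), area(Q) / area(P))

If the uniformity claim holds, the two printed numbers should agree
(here both come out near 0.25) up to sampling error.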