From: reginald on
I have an LDPC decoder that works with soft bit estimates. This is great for
the case of BPSK or QPSK, but I'd like to extend it to something like a
higher order QAM constellation.

Is there:

(a) a straightforward way to generate soft bit estimates from higher order
constellations like, say, 16QAM?

(b) more specifically, an extension to an LDPC decoder that works with
likelihoods of each bit? This seems complicated since in general your
"check nodes" and "bit nodes" will be correlated in some way that the
algorithm isn't anticipating.

I would greatly appreciate any light that you may shed on my problem, and
thanks in advance.

Regards,
Reggie


From: Vladimir Vassilevsky on


reginald wrote:
> I have an LDPC decoder that works with soft bit estimates. This is great for
> the case of BPSK or QPSK, but I'd like to extend it to something like a
> higher order QAM constellation.

This question has been asked regularly in this NG. Steve Pope has
explained it no fewer than 10 times. Search the archives.

http://www.dsprelated.com/showmessage/107583/1.php


Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant
http://www.abvolt.com
From: alos on

>(a) a straightforward way to generate soft bit estimates from higher order
>constellations like, say, 16QAM?

Yes. Generally an approximation to the true marginal bit likelihood is
used: LLR(bi) = 1/(2*sigma^2) * d(S,bi=1)/d(S,bi=0),
where sigma^2 is the variance of the noise (plus the channel gain
estimate variance), and d(S,bi=1) and d(S,bi=0) are the distances along
the I or Q axis (depending on which axis bit bi is defined on) to the
closest constellation point having bit bi equal to 1 or 0 respectively.
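As a concrete sketch of that per-axis calculation: the snippet below assumes a Gray-mapped 4-PAM axis (levels -3, -1, +1, +3; this particular bit labeling is an illustrative choice, not taken from the thread) and uses the max-log form with squared distances, (d(S,bi=0)^2 - d(S,bi=1)^2)/(2*sigma^2).

```python
# Illustrative Gray labeling of one 16QAM axis: amplitude -> (b0, b1).
GRAY_4PAM = {-3: (0, 0), -1: (0, 1), +1: (1, 1), +3: (1, 0)}

def axis_llrs(y, sigma2):
    """Max-log LLRs of the two bits carried by one received axis value y.

    For each bit, take the squared distance from y to the nearest
    amplitude labeled 0 and to the nearest labeled 1; their difference
    over 2*sigma2 approximates the LLR, positive when bi = 1 is the
    more likely value.
    """
    llrs = []
    for i in (0, 1):
        d0 = min((y - a) ** 2 for a, b in GRAY_4PAM.items() if b[i] == 0)
        d1 = min((y - a) ** 2 for a, b in GRAY_4PAM.items() if b[i] == 1)
        llrs.append((d0 - d1) / (2.0 * sigma2))
    return llrs
```

For a full 16QAM symbol the same function is applied independently to the I and Q components, yielding four soft bits per symbol for the decoder.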


>(b) more specifically, an extension to an LDPC decoder that works with
>likelihoods of each bit? This seems complicated since in general your
>"check nodes" and "bit nodes" will be correlated in some way that the
>algorithm isn't anticipating.

I thought that standard LDPC decoding works on bit likelihoods?
What is being ignored in the LLR calculation mentioned above is in fact
the joint probability of all the bits given the received symbol. Instead
we calculate marginal likelihoods, thus pretending that each bit is
independent (and they are not). BTW, I suppose that introducing the
joint probabilities into the message-passing algorithm would buy some
BER...

So there are two approximations: first, the true probability is
approximated with independent marginal probabilities of the bits;
second, the marginal probabilities (likelihoods) are approximated with
the piecewise-linear function LLR(bi).
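The second of these approximations is easy to check numerically: the exact marginal LLR sums Gaussian likelihoods over every constellation point in each bit set, while the max-log version keeps only the nearest point of each set and collapses to a piecewise-linear function of the received value. A sketch for one Gray-mapped 4-PAM axis (the bit labeling is an illustrative assumption, and squared distances are used, per the usual max-log derivation):

```python
import math

# Illustrative Gray labeling of one 16QAM axis: amplitude -> (b0, b1).
GRAY_4PAM = {-3: (0, 0), -1: (0, 1), +1: (1, 1), +3: (1, 0)}

def exact_llr(y, sigma2, i):
    """True marginal LLR of bit i: log ratio of the summed Gaussian
    likelihoods of the points labeled 1 versus those labeled 0."""
    p1 = sum(math.exp(-(y - a) ** 2 / (2 * sigma2))
             for a, b in GRAY_4PAM.items() if b[i] == 1)
    p0 = sum(math.exp(-(y - a) ** 2 / (2 * sigma2))
             for a, b in GRAY_4PAM.items() if b[i] == 0)
    return math.log(p1) - math.log(p0)

def maxlog_llr(y, sigma2, i):
    """Keep only the dominant term of each sum; the result is
    piecewise linear in y."""
    d1 = min((y - a) ** 2 for a, b in GRAY_4PAM.items() if b[i] == 1)
    d0 = min((y - a) ** 2 for a, b in GRAY_4PAM.items() if b[i] == 0)
    return (d0 - d1) / (2 * sigma2)
```

At high SNR the two agree almost exactly; near decision boundaries and at low SNR the max-log value deviates from the true marginal, which is part of the loss these approximations trade for simplicity.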


Regards,
Alek


From: Eric Jacobsen on
On 12/23/2009 6:34 AM, reginald wrote:
> I have an LDPC decoder that works with soft bit estimates. This is great for
> the case of BPSK or QPSK, but I'd like to extend it to something like a
> higher order QAM constellation.
>
> Is there:
>
> (a) a straightforward way to generate soft bit estimates from higher order
> constellations like, say, 16QAM?
>
> (b) more specifically, an extension to an LDPC decoder that works with
> likelihoods of each bit? This seems complicated since in general your
> "check nodes" and "bit nodes" will be correlated in some way that the
> algorithm isn't anticipating.
>
> I would greatly appreciate any light that you may shed on my problem, and
> thanks in advance.
>
> Regards,
> Reggie

This problem has been solved for many types of soft-input decoders, for
which the problem is essentially the same. You might look around for
how this is done with Turbo Codes, Viterbi decoders, etc., etc., and
then formulate an application for LDPCs.

It's actually a little easier for most LDPCs since there is no formal
trellis structure. The randomness of the code (assuming it's not a
highly structured code, which most LDPCs are not, for performance
reasons) tends to naturally decorrelate the bits. In other words, the
symbol decorrelation that is usually provided by a channel interleaver
essentially happens inside the code itself.

--
Eric Jacobsen
Minister of Algorithms
Abineau Communications
http://www.abineau.com
From: alos on

>Yes. Generally the approximation to the true likelihood of
>marginal probability is used: LLR(bi) = 1/(2*sigma^2) *
>d(S,bi=1)/d(S,bi=0)

Sorry. I meant LLR(bi) = 1/(2*sigma^2) * (d(S, bi=0) - d(S, bi=1))