From: glen herrmannsfeldt on
Richard Owlett wrote:

(I wrote)
>> Square waves are not orthogonal, which makes the transform
>> harder, but not impossible.

> Why would they be any less orthogonal than sets of sines and cosines?
> They should also span the space as well as sines and cosines.

They span the space fine, but aren't orthogonal.

>> Still, you might look at the ICT. It is a transform specifically
>> designed to make the forward transform fast on a machine with no
>> multiply instructions. You might find from there some of the
>> properties of transforms on non-orthogonal basis functions.

>> The ICT assumes that the inverse transform is done on a much
>> faster processor (on earth). The forward case can be done
>> fast on a CDP1802.

> By ICT do you mean Inverse Cosine Transform? That was the only relevant
> Google hit I found. My signal is real but not necessarily even (or odd).

> I'd be interested in anything that eliminated multiplications.

Integer Cosine Transform.

http://tda.jpl.nasa.gov/progress_report/42-119/119M.pdf

Consider that you have to do the forward transform on a CDP1802,
maybe similar in power to a 6502. You have a limited time in which
to do the transform, and limited memory for intermediate data.

The inverse can be done using the fastest hardware you can find,
and you are much less restricted in the time available. Such
restrictions led to an interesting new transform.
(Well, not so new now.)
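The idea behind such a transform is easy to sketch. With small-integer basis coefficients, every "multiply" in the forward transform reduces to shifts and adds, which a CDP1802-class CPU can do cheaply. Here is a toy illustration of that trick (the coefficients and the 4-point structure are made up for the example; they are not the actual ICT coefficients from the JPL report):

```python
# Illustrative only: small-integer "multiplies" done with shift/add,
# the trick that makes an integer transform cheap on a CPU with no
# multiply instruction. Coefficients here are invented, not the ICT's.
def times3(x):
    return (x << 1) + x          # 3x = 2x + x

def times5(x):
    return (x << 2) + x          # 5x = 4x + x

def butterfly(a, b):
    return a + b, a - b          # sum/difference stage, no multiply

def toy_forward(x0, x1, x2, x3):
    """A toy 4-point integer transform using only shift/add arithmetic."""
    s0, d0 = butterfly(x0, x3)
    s1, d1 = butterfly(x1, x2)
    return (s0 + s1,                     # DC term
            times3(d0) + times5(d1),     # made-up integer coefficients
            s0 - s1,
            times5(d0) - times3(d1))
```

The inverse of such a transform generally does need real multiplies (and divides), which is exactly why it is left to the fast hardware on the ground.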

-- glen

From: Richard Owlett on
Ron N. wrote:

> On Nov 25, 4:30 am, Richard Owlett <rowl...(a)atlascomm.net> wrote:
>
>>glen herrmannsfeldt wrote:
>>
>>>Richard Owlett wrote:
>>
>>>(snip)
>>
>>>>I'm not interested in the computation time but in whether or not I can
>>>>follow what's going on. Eventually I want to experiment with using
>>>>square waves instead of sinusoids as my basis functions. I suspect it
>>>>will be a better model for my situation.
>>
>>>Square waves are not orthogonal, which makes the transform
>>>harder, but not impossible.
>>
>>Why would they be any less orthogonal than sets of sines and cosines?
>>They should also span the space as well as sines and cosines.
>
>
> It depends on which waves. Even waves are orthogonal to odd
> waves (sinusoidal or square), periodic waves are orthogonal
> to waves at 2^N times the frequency (sinusoidal or square).
> But square waves are not orthogonal to square waves at three
> times the frequency (or other odd multiples), whereas
> sinusoids are.

Oh, the trouble one gets into when reasoning by analogy without checking ;)
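Ron's point is easy to check numerically. A quick sketch (mine, not from the thread): the inner product of a square wave with the square wave at 3x its frequency comes out near 1/3, while the corresponding sinusoids integrate to zero.

```python
# Numerical check: sinusoids at 1x and 3x are orthogonal over a full
# period, but square waves at 1x and 3x are not (inner product ~1/3).
import math

N = 1200  # samples over one fundamental period

def square(f):
    return [1.0 if math.sin(2 * math.pi * f * i / N) >= 0 else -1.0
            for i in range(N)]

def sine(f):
    return [math.sin(2 * math.pi * f * i / N) for i in range(N)]

def dot(a, b):
    # normalized inner product over one period
    return sum(x * y for x, y in zip(a, b)) / len(a)

print(dot(sine(1), sine(3)))      # ~0: sinusoids are orthogonal
print(dot(square(1), square(3)))  # ~1/3: square waves are not
```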

>
> There are Walsh functions which do span a space and are
> orthogonal, but they are not purely "square" waves, if square
> means periodic with a constant 50% duty cycle. Since they
> are not completely constant in periodicity or duty cycle,
> using them to determine "frequency content", for more common
> usages of that term, might be problematic.
>

I can't see an EXPLICIT requirement for 50% duty cycle. However, I'm
looking for simplicity, and a ZERO mean value can easily be shown to be a
requirement. So a 50% duty cycle probably is IMPLICIT.
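The Walsh functions mentioned above are easy to generate and check. A small illustration (mine, not from the thread): the rows of a Sylvester-construction Hadamard matrix are the Walsh functions in natural order; they are mutually orthogonal, and every row except the constant one has zero mean, consistent with the zero-mean requirement noted above.

```python
# Rows of a Sylvester Hadamard matrix = Walsh functions (natural order).
# Check: distinct rows are orthogonal; non-constant rows have zero mean.
def hadamard(n):
    """Sylvester construction; n must be a power of two."""
    H = [[1]]
    while len(H) < n:
        # double the size: [H  H; H  -H]
        H = [row + row for row in H] + \
            [row + [-x for x in row] for row in H]
    return H

H = hadamard(8)
for i in range(8):
    for j in range(8):
        d = sum(a * b for a, b in zip(H[i], H[j]))
        assert d == (8 if i == j else 0)   # orthogonal basis
assert all(sum(row) == 0 for row in H[1:]) # zero mean except row 0
print("8x8 Walsh rows: orthogonal, zero-mean except the constant row")
```

Note the varying duty cycles within each row, which is why Walsh "sequency" is not quite frequency.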

Thanks.


From: Richard Owlett on
glen herrmannsfeldt wrote:

> Richard Owlett wrote:
>
> (I wrote)
>
>>> Square waves are not orthogonal, which makes the transform
>>> harder, but not impossible.
>
>
>> Why would they be any less orthogonal than sets of sines and cosines?
>> They should also span the space as well as sines and cosines.
>
>
> They span the space fine, but aren't orthogonal.
>
>>> Still, you might look at the ICT. It is a transform specifically
>>> designed to make the forward transform fast on a machine with no
>>> multiply instructions. You might find from there some of the
>>> properties of transforms on non-orthogonal basis functions.
>
>
>>> The ICT assumes that the inverse transform is done on a much
>>> faster processor (on earth). The forward case can be done
>>> fast on a CDP1802.
>
>
>> By ICT do you mean Inverse Cosine Transform? That was the only
>> relevant Google hit I found. My signal is real but not necessarily
>> even (or odd).
>
>
>> I'd be interested in anything that eliminated multiplications.
>
>
> Integer Cosine Transform.
>
> http://tda.jpl.nasa.gov/progress_report/42-119/119M.pdf

Written by and FOR mathematicians ;<

>
> Consider that you have to do the forward transform on a CDP1802,

When Googling for CDP1802 to see if it was what I recalled as an 1802, I found

"A CDP1802 is much like "486" or Pentium is to Windows PCs today"
http://www.cdp1802.org/cdp1802.html

OK, so what is a little context between friends ;)

> maybe similar in power to a 6502. You have a limited time in which
> to do the transform, and limited memory for intermediate data.
>
> The inverse can be done using the fastest hardware you can find,
> and you are much less restricted in the time available. Such
> restrictions led to an interesting new transform.
> (Well, not so new now.)
>
> -- glen
>
From: Mark Borgerding on
Richard Owlett wrote:
> In "Chapter 12 - The Fast Fourier Transform / FFT Programs" of _The
> Scientist and Engineer's Guide to Digital Signal Processing_
> Smith says "There are also FFT routines that completely eliminate the
> bit reversal sorting."
>
> Could someone point me to a description of one, preferably with BASIC or
> FORTRAN sample code.
>
> TIA


KissFFT does not use bit-reversed addressing.
I think bit-reversed addressing is not a good match for the mixed-radix FFT.

If you are interested in seeing the derivation, I put together a pdf
that contains the derivation used in kissfft.

http://sourceforge.net/projects/kissfft
(go to download, browse all files)

-- Mark
From: Richard Owlett on
Mark Borgerding wrote:
> Richard Owlett wrote:
>
>> In "Chapter 12 - The Fast Fourier Transform / FFT Programs" of _The
>> Scientist and Engineer's Guide to Digital Signal Processing_
>> Smith says "There are also FFT routines that completely eliminate the
>> bit reversal sorting."
>>
>> Could someone point me to a description of one, preferably with BASIC
>> or FORTRAN sample code.
>>
>> TIA
>
>
>
> KissFFT does not use bit-reversed addressing.
> I think bit-reversed addressing is not a good match for the mixed-radix FFT.
>
> If you are interested in seeing the derivation, I put together a pdf
> that contains the derivation used in kissfft.
>
> http://sourceforge.net/projects/kissfft
> (go to download, browse all files)
>
> -- Mark
Downloaded.
This group seems to insist I understand the math I took 40 years ago ;)
Thanks
