From: LawCounsels on
Warm Greetings :

I am now pleased to be in a position to issue a press release to experienced, reputable forum members. (The atomic bomb was thought a 'Newtonian' impossibility; that didn't stop many from trying ... the latest Iranian attempt is as yet unsuccessful, N. Korea was much luckier, India/Pakistan did it and kept it confidential .... C# knowledge preferred.):


Kindly reply; an email will do:


" I shall keep all disclosure re new generation data compression
methods in commercial confidentiality , only ever to do only with
prior written consent "
will forward the mathematics basis overview & mathematics proofs , to
begin 'profit shares' confidential collaborations/ further
developments etc.


This forms the 'rigorous' mathematical basis, much as the so-called Einstein's mathematics overturned all of Newton's.


To begin, first familiarise yourself with an invented/discovered mathematical object [the mathematics proven by a senior maths professor in NSW, Australia, and by a Polish mathematician independently], whereby any 'random (or not)' N bits (an iteration's input string), with 2^N possibilities, can ALWAYS, INVARIABLY, be completely covered/represented by ONLY a few shorter bit strings of lesser length N-1 or N-2 or N-3 ... N-P [P around log2(N)] .... you would then definitely be able to decide.


=> near-infinite data representations follow mathematically [naturally]


The Australian professor was even more sceptical, but I am now content that these new foundations of mathematics were discovered some 50 years later than they could have been.


with Kind Regards,
Intellectual Properties Holding International LTD
eFAX : +001 484 3464116




[ Q ] QUOTE (LawCounsels @ Aug 11 2010, 04:38 AM)
whereby any 'random (or not)' N bits (an iteration's input string), with 2^N possibilities, can ALWAYS, INVARIABLY, be completely covered/represented by ONLY a few shorter bit strings of lesser length N-1 or N-2 or N-3 ... N-P [P around log2(N)] .... you would then definitely be able to decide


Then the logical implication of being able to compress any sequence of N bits to M < N bits is that by repeated application you end up with a single bit. You're claiming you have a perfect lossless compression algorithm for any input sequence; thus you're claiming you can encode any amount of data of any type into a single bit, which is obviously false. There's a mathematical theorem (a simple counting/pigeonhole argument) which formalises this.

Lossless compression algorithms rely on the fact that the algorithm will be applied to specific types of data. Human-written text is easy to do, since it follows specific rules. Compressing random noise losslessly is impossible. Any lossless compression algorithm will be able to produce a file for random noise, but it will be bigger than the amount of data in the random noise (due to the algorithm's file overheads).

If you claim otherwise then you should be able to correctly decompress a single bit into anything. For instance, the three sequences 00000000, 01010101 and 11111111 could each be compressed to a single bit, either 0 or 1. But if I give you 0 and 1 then you can only give me two of those three sequences with confidence (and that's ignoring the infinitely many other sequences). Obviously your claim is impossible.
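The theorem referred to is the counting (pigeonhole) argument, and the count can be checked mechanically. A minimal sketch in Python (illustrative only; the thread contains no code and the function names are mine):

```python
# Pigeonhole count: there are 2**N bit strings of length N, but only
# 2**N - 2 non-empty strings of all the shorter lengths combined.

def strings_of_length(n):
    return 2 ** n

def strings_shorter_than(n):
    # total number of bit strings of lengths 1 .. n-1
    return sum(2 ** k for k in range(1, n))

N = 8
assert strings_of_length(N) == 256
assert strings_shorter_than(N) == 254          # 2**8 - 2
assert strings_shorter_than(N) < strings_of_length(N)
```

Since every length from 1 to N-1 together supplies fewer strings than length N alone, no injective (i.e. losslessly invertible) map can shorten every length-N input.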



[ A ] >>whereby any 'random (or not)' N bits (an iteration's input string), with 2^N possibilities, can ALWAYS, INVARIABLY, be completely covered/represented by ONLY a few shorter bit strings of lesser length N-1 or N-2 or N-3 ... N-P [P around log2(N)] .... you would then definitely be able to decide

The mathematical proof (independently proven by a reputable senior university maths professor and by Polish mathematicians) is as stated above .... mathematically it does not follow that any bit string can be compressed into 1 bit ... it is not difficult to deduce mathematically that this is impossible (starting from the above statement :)



[ Q ] QUOTE (LawCounsels @ Aug 11 2010, 12:42 PM)
>>whereby any 'random (or not)' N bits (an iteration's input string), with 2^N possibilities, can ALWAYS, INVARIABLY, be completely covered/represented by ONLY a few shorter bit strings of lesser length N-1 or N-2 or N-3 ... N-P [P around log2(N)]


Those N bits don't sound too random, then.

[ A ] >>Those N bits don't sound too random, then


The maths proof (also further verified TRUE in mundane practice by 'brute force' enumerations, like the 'brute force' on Riemann's largest primes) shows it works whether the N bits are 'random' or not :)



[ Q ] QUOTE (LawCounsels @ Aug 11 2010, 07:19 PM)
>>Those N bits don't sound too random, then


The maths proof (also further verified TRUE in mundane practice by 'brute force' enumerations, like the 'brute force' on Riemann's largest primes) shows it works whether the N bits are 'random' or not


Yuk, yuk ... the definition of a "random bit string" is pretty much
"not compressible". No, I'm not going to give a long treatise
supporting that statement. Bye, bye.


[ A ] >>Yuk, yuk ... the definition of a "random bit string" is pretty
much "not compressible". No, I'm not going to give a long treatise
supporting that statement.


The Newtonian definition said pretty much the same thing .... I thought there was this 'relativity' definition which profoundly changed things (or did it just change 'definitions', nothing more (?))



[ Q ] QUOTE (AlphaNumeric @ Aug 11 2010, 12:30 PM)
Lossless compression algorithms rely on the fact that the algorithm will be applied to specific types of data. Human-written text is easy to do, since it follows specific rules. Compressing random noise losslessly is impossible. Any lossless compression algorithm will be able to produce a file for random noise, but it will be bigger than the amount of data in the random noise (due to the algorithm's file overheads).


This is untrue.

Simple run-length encoding can losslessly compress random noise and obtain a smaller byte count than the original. It's not LIKELY to happen given true random data, but it is possible.

I can demonstrate this with a simple example if you would like.
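A sketch of that demonstration, as a toy run-length encoder in Python. The cost model here is an assumption of mine (9 bits per run: an 8-bit run length plus 1 value bit), chosen only to make the sizes concrete:

```python
from itertools import groupby

def rle(bits):
    """Toy run-length code: each maximal run of equal bits becomes
    a (length, bit) pair. Format is hypothetical, for illustration;
    run lengths are assumed to fit in 8 bits."""
    return [(len(list(g)), b) for b, g in groupby(bits)]

def encoded_size_bits(bits):
    # assumed cost model: 8-bit run length + 1 value bit per run
    return 9 * len(rle(bits))

# A lucky input (one long run) compresses well below its 50 bits:
assert encoded_size_bits('0' * 50) == 9
# A typical alternating input expands badly instead (50 runs):
assert encoded_size_bits('01' * 25) == 450
```

A long run is possible but exponentially unlikely in true random data, which is why RLE wins on some inputs yet loses on average.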


[ A ] >>This is untrue.

>>Simple run-length encoding can loss-lessly compress random noise and obtain a smaller byte count than the original. it's not LIKELY to happen given true random data, but it is possible.


yes

just as all of Newtonian mechanics would say relativity is simply untrue, given the many known real Newtonian observations and examples

Einstein's mathematical proof was then understood by few, but those few are the ones who really matter


Because all of mankind's earth-confined Newtonian observations showed light travelling in a straight line, it took physicists until a solar eclipse some years later to affirm the new light-bending observations

Earlier, was it Copernicus (?) who said the earth is round, not flat, and for decades no one observed this even after it was suggested

From: LawCounsels on
Here is a link to download a 'mathematics structure' encoding of a completely random 4,074-bit-long file (a randomly chosen part of Mark Nelson's AMillionRandomDigits.bin challenge):

www dot box dot net/shared/eyy2v28dbf


Details of the download link's newly discovered 'mathematics structure' encoded file:

.. In a file of N bits (sufficiently large, say 8 Kbits up to 1 Mbits), assume the distribution of the unique prefixes '0', '10', '110', '1110' ... '111...10' (always ending with a '0') is standard binomial (i.e. random), BUT with the added constraint/restriction that the maximum length of a '111...10' prefix at any time can be at most log2(R), where R is the total number of bits from the present bit position to the end (least significant bit) position. [E.g. if the bit string is 0 111110 11110 10 0 10 110 10 0 10 0 10 0 0, then here '110' is at the 13th bit; there can be no unique prefix whose total length in bits exceeds log2(R) at any time, where R is the bit position of the unique prefix counting from the least significant end ....]
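A sketch of the stated restriction as I read it, in Python. The tokeniser and helper names are my own, and one interpretive choice is flagged in the comments: taken literally, the rule would forbid every string at its final token (there R equals the token's own length, and L <= log2(L) never holds), so that token is exempted here.

```python
import math

def tokens(bits):
    """Split a bit string into unique prefixes '0', '10', '110', ...
    (each token is a run of 1s terminated by a single 0)."""
    out, run = [], ''
    for b in bits:
        run += b
        if b == '0':
            out.append(run)
            run = ''
    assert run == '', "string must end on a complete '...0' token"
    return out

def token_allowed(L, R):
    """The post's restriction: a unique prefix of L bits may occur
    where R bits remain only if L <= log2(R)."""
    return L <= math.log2(R)

def satisfies_constraint(bits):
    """Check every token against the restriction (final token exempted,
    as explained in the lead-in)."""
    pos = 0
    for t in tokens(bits):
        R = len(bits) - pos              # bits from here to the end
        if R > len(t) and not token_allowed(len(t), R):
            return False
        pos += len(t)
    return True

# The post's own boundary case: a 4-bit prefix '1110' is disallowed
# at R = 15 but allowed at R = 16, since log2(15) < 4 <= log2(16):
assert not token_allowed(4, 15)
assert token_allowed(4, 16)
```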


.....So this is not the simple, regular, usual binomial power series / random distribution [the usual binomial power series / random distribution is 'god-gifted' not to be compressible], but here there is an added constraint/restriction: e.g. '1110' (4 bits long) cannot occur at position R = 15 or smaller (since log2(R) for R = 15 or any smaller value of R will not allow a unique prefix of total length >= 4 bits to occur at any position R < 16) ......
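Whether any saving follows can be sanity-checked by counting: if the restriction really excludes strings, the surviving set can be indexed in ceil(log2(count)) bits, but only strings inside that set get a code at all. A self-contained brute-force count for small N, under the same (partly hypothetical) reading of the rule used above, with the final run exempted:

```python
import math
from itertools import product

def violates(bits):
    """True if some unique prefix '1...10' of length L starts where
    R bits remain and L > log2(R) (the post's restriction; the final
    run is exempted, since there R equals L and a literal reading
    would forbid every string)."""
    pos, run = 0, 0
    for i, b in enumerate(bits):
        run += 1
        if b == '0':
            L, R = run, len(bits) - pos
            if R > L and L > math.log2(R):
                return True
            pos, run = i + 1, 0
    return False

N = 8
all_strings = [''.join(p) for p in product('01', repeat=N)]
survivors = [s for s in all_strings if not violates(s)]

# Strictly fewer than 2**N strings satisfy the constraint, so members
# of the constrained set are indexable in ceil(log2(len(survivors)))
# bits; strings outside the set receive no short code at all.
assert len(survivors) < len(all_strings)
index_bits = math.ceil(math.log2(len(survivors)))
assert index_bits <= N
```

Note the catch this exposes: an index shorter than N bits exists only if the constraint removes at least half of all 2^N strings, and even then the scheme says nothing about inputs outside the constrained set.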


THIS IS IMMEDIATELY, APPARENTLY, READILY COMPRESSIBLE: a 4,073-bit 'constraint' / 'mathematics structures' encoded file (1 bit smaller), from the input 'random' 4,074-bit file.