From: Yannick Duchêne (Hibou57) on
Lost in Translation... do you know the movie?

No, I'm joking; this is not about the movie. This is about SPARK, and
more precisely about translating some expressions into the rules language.

Previously, I successfully added these rules in an *.RLU file:

inequals(122): X/Y<1 may_be_deduced_from [ X>=0, Y>0, Y>X ].
inequals(123): X/Y<1 may_be_deduced_from [ X<=0, Y<0, Y<X ].
inequals(124): X/Y>(-1) may_be_deduced_from [ X>=0, Y<0, (-Y)>X ].
inequals(125): X/Y>(-1) may_be_deduced_from [ X<=0, Y>0, Y>(-X) ].
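
For context, these are intended to discharge checks of roughly this
shape; a made-up fragment with invented names, where the hypotheses
X >= 0, Y > 0 and Y > X come from the subtypes and the precondition:

    procedure Ratio_Floor (X : in Natural; Y : in Positive; Q : out Integer)
    --# derives Q from X, Y;
    --# pre Y > X;
    is
    begin
       --# check X / Y < 1;
       Q := X / Y;
    end Ratio_Floor;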


I wanted to do the same for another kind of rule, about bitwise
operations. The rule I wanted to add is something like:

    (X and 2 ** N) = 0 may be deduced from ((X / (2 ** N)) and 1) = 0

and the converse.

I tried with this:

bitwise(112): bit__and(X div (2**N), 1)=0 may_be_deduced_from
              [ (bit__and(X, 2**N)=0), X>=0, N>=0 ].
bitwise(113): bit__and(X, 2**N)=0 may_be_deduced_from
              [ (bit__and(X div (2**N), 1)=0), X>=0, N>=0 ].

But this fails, although the Simplifier does not return any syntax
error messages about this *.RLU file. It fails because it cannot prove
something like this:

--# check ((Instance and 2) = 0) -> (((Instance / 2) and 1) = 0);

The latter should follow directly from the two rules above if they were
OK. So I suppose something is wrong with these rules, and the
expression in “[...]” is not interpreted the way I think it is.

One entertaining thing: I've noticed that “and 1” is always replaced by
“mod 2” in the Simplifier's *.SIV output files. As an example, the
Check clause above is replaced by:

C1: bit__and(instance, 2) = 0 -> instance div 2 mod 2 = 0 .
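
Presumably this is because, for non-negative X, “X and 1” is just the
low bit of X, which equals X mod 2 (for example, 5 and 1 = 1 = 5 mod 2,
and 6 and 1 = 0 = 6 mod 2), so the Simplifier normalises the former
into the latter.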
From: Phil Thornley on
On 26 May, 11:09, Yannick Duchêne (Hibou57) <yannick_duch...(a)yahoo.fr>
wrote:
[...]
> I tried with this:
>
>     bitwise(112): bit__and(X div (2**N), 1)=0 may_be_deduced_from
>                   [ (bit__and(X, 2**N)=0), X>=0, N>=0 ].
>     bitwise(113): bit__and(X, 2**N)=0 may_be_deduced_from
>                   [ (bit__and(X div (2**N), 1)=0), X>=0, N>=0 ].
>
> But this fails, although the Simplifier does not return any syntax
> error messages about this *.RLU file. It fails because it cannot prove
> something like this:
>
>     --# check ((Instance and 2) = 0) -> (((Instance / 2) and 1) = 0);
>
> The latter should follow directly from the two rules above if they were
> OK. So I suppose something is wrong with these rules, and the
> expression in “[...]” is not interpreted the way I think it is.

I don't think there is anything wrong with these rules, but the
Simplifier will not use a *combination* of user rules to discharge a
proof. (They are not as 'powerful' as the built-in rules, because they
are not consulted until the Simplifier has tried all its other
strategies.)

For example, section 7.3.1 of the Simplifier User Manual says:
"For an inference rule of the form
rulename(1): Goal may_be_deduced_from Conditions.
the Simplifier will attempt to find a means of instantiating
all of the wildcards in Goal such that it becomes an exact
match for an existing undischarged conclusion."

This means that a single inference rule is applied only if it
*directly* matches a conclusion.

Also, since user rules are applied right at the end, you have to use
the form of the conclusion as it appears in the SIV file, so:
> One entertaining thing: I've noticed that “and 1” is always replaced by
> “mod 2” in the Simplifier's *.SIV output files. As an example, the
> Check clause above is replaced by:
>
>     C1: bit__and(instance, 2) = 0 -> instance div 2 mod 2 = 0 .

If that is the conclusion left in the SIV file then the user rule to
prove it is:
user_bitwise(1): bit__and(X, 2) = 0 -> X div 2 mod 2 = 0
may_be_deduced .
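
For the converse direction you also mentioned, a rule of the same shape
might work (untested; the goal still has to match the SIV conclusion
exactly, and the X >= 0 condition has to be available among the
hypotheses):

user_bitwise(2): X div 2 mod 2 = 0 -> bit__and(X, 2) = 0
may_be_deduced_from [ X >= 0 ] .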

(and a minor comment is that it is probably not a good idea to use the
Proof Checker rule family names for Simplifier user rules - it just
adds to the confusion about how the rules are used.)

Cheers,

Phil
From: Yannick Duchêne (Hibou57) on
On Wed, 26 May 2010 12:38:52 +0200, Phil Thornley
<phil.jpthornley(a)googlemail.com> wrote:
> I don't think there is anything wrong with these rules, but the
> Simplifier will not use a *combination* of user rules to discharge a
> proof. (They are not as 'powerful' as the built-in rules, because they
> are not consulted until the Simplifier has tried all its other
> strategies.)
>
> For example, section 7.3.1 of the Simplifier User Manual says:
> "For an inference rule of the form
> rulename(1): Goal may_be_deduced_from Conditions.
> the Simplifier will attempt to find a means of instantiating
> all of the wildcards in Goal such that it becomes an exact
> match for an existing undischarged conclusion."
>
> This means that a single inference rule is applied only if it
> *directly* matches a conclusion.
>
> Also, since user rules are applied right at the end, you have to use
> the form of the conclusion as it appears in the SIV file, so:
>> One entertaining thing: I've noticed that “and 1” is always replaced by
>> “mod 2” in the Simplifier's *.SIV output files. As an example, the Check
>> clause above is replaced by:
>>
>> C1: bit__and(instance, 2) = 0 -> instance div 2 mod 2 = 0 .
>
> If that is the conclusion left in the SIV file then the user rule to
> prove it is:
> user_bitwise(1): bit__and(X, 2) = 0 -> X div 2 mod 2 = 0
> may_be_deduced .
I remember reading about this; I just did not suspect that the match had
to be so exact.

> (and a minor comment is that it is probably not a good idea to use the
> Proof Checker rule family names for Simplifier user rules - it just
> adds to the confusion about how the rules are used.)
Oops, sorry, I thought it was better to follow the naming convention
that was already established.


That's the overall picture for me, so far.

I don't know what to make of one feeling, and maybe I'm wrong: I feel a
bit discouraged with SPARK now. I wonder whether it is really the thing
I had been waiting for so long (I am referring to some of my previous
words about it in another thread).

Maybe I should give up on it.

If I could believe there might be some funding or a market for it, I
would like to work on another system in this area... but I don't think
so (I don't think there are many people waiting for this kind of thing).

Well, for the time being, I think I am going to forget about it, at
least for a while, and go on without it.

--
There is something even better than a pragma Assert: a SPARK --# check.
From: Pascal Obry on

Yannick,

> That's the overall picture for me, so far.
>
> I don't know what to make of one feeling, and maybe I'm wrong: I feel a
> bit discouraged with SPARK now. I wonder whether it is really the thing
> I had been waiting for so long (I am referring to some of my previous
> words about it in another thread).

Maybe you were expecting something that SPARK is not. SPARK is not a
replacement for Ada; it is designed for highly secure software. You
won't create a Web server, an XML parser, or even some kind of
simulation or GUI with it. In those application domains you need full
Ada (bindings to external libraries, generics, full OO, the standard
libraries Ada.*, Interfaces.* and possibly GNAT.*, full text and stream
IO...).

I had to create a binding to the OS sockets in SPARK, and I can tell you
that it was not easy... especially as it was my first experience with
SPARK!

For embedded control/command applications, that's another story. I think
that SPARK has something invaluable to offer there.

I have also thought that you could mix SPARK and Ada in the same
application, using SPARK for the critical part and Ada for the rest...
I don't know how well this would work, as I have not tried it yet.
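
Something along these lines, perhaps (only a sketch, with invented
names): the critical package stays inside the SPARK subset and carries
the annotations, while the callers are ordinary Ada units that the
Examiner simply never sees:

   package Critical_Core
   --# own State;
   --# initializes State;
   is
      procedure Step (Input : in Integer);
      --# global in out State;
      --# derives State from State, Input;
   end Critical_Core;

   -- Ordinary Ada, outside the SPARK boundary, not given to the Examiner:
   with Critical_Core;
   procedure Main is
   begin
      Critical_Core.Step (42);
   end Main;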

Just my 2 cents of course!

Pascal.

--

--|------------------------------------------------------
--| Pascal Obry Team-Ada Member
--| 45, rue Gabriel Peri - 78114 Magny Les Hameaux FRANCE
--|------------------------------------------------------
--| http://www.obry.net - http://v2p.fr.eu.org
--| "The best way to travel is by means of imagination"
--|
--| gpg --keyserver keys.gnupg.net --recv-key F949BD3B

From: Dmitry A. Kazakov on
On Wed, 26 May 2010 16:15:41 +0200, Pascal Obry wrote:

> I have also thought that you could mix SPARK and Ada in the same
> application, using SPARK for the critical part and Ada for the rest...
> I don't know how well this would work, as I have not tried it yet.

Yes. I think this could be a direction in which Ada should evolve. It
should have a modular subset equivalent to SPARK, which could be applied
to selected compilation units, so that the programmer could choose the
level of safety he is ready to invest in.

It would be nice to be able to start a project at some middle level (a
bit higher than present Ada, but lower than SPARK), and then gradually
adjust it as the project evolves.

Another 2 cents.

--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de