Writing for section 6

This commit is contained in:
Jonathan Chan 2025-04-20 11:05:36 -04:00
parent 6aa8f1049c
commit dc84c63523
5 changed files with 96 additions and 40 deletions

View File

@ -500,7 +500,7 @@ Furthermore, $[[m ⤳ n]]$ isn't really a full reduction relation on its own,
since it's defined mutually with strongly normal terms,
and it doesn't describe reducing anywhere other than in the head position.
The solution we take from POPLMark Reloaded is to prove that $\kw{SN}$ is sound
with respect to a traditional presentation of strongly normal terms
based on full small-step reduction $[[v ⇝ w]]$, $[[m ⇝ n]]$ of values and computations.
I omit here the definitions of these reductions,
@ -519,10 +519,7 @@ This rules out looping and diverging terms, since they never stop stepping.
\drule[]{sn-Com}
\end{mathpar}
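To make the accessibility reading of $[[m ∈ sn]]$ concrete, here is a minimal sketch in Lean over an abstract step relation (a hypothetical rendering for illustration only, not the actual development): a term is strongly normal exactly when all of its one-step reducts are.
\begin{verbatim}
-- Strong normalization as an accessibility predicate over an abstract
-- one-step reduction `step`: every reduction sequence out of `m` is finite,
-- since each constructor application accounts for exactly one step.
inductive Sn {α : Type} (step : α → α → Prop) : α → Prop where
  | intro (m : α) (h : ∀ n, step m n → Sn step n) : Sn step m

-- A normal form (a term with no reducts at all) is vacuously strongly normal.
theorem sn_of_normal {α : Type} (step : α → α → Prop) (m : α)
    (hnf : ∀ n, ¬ step m n) : Sn step m :=
  .intro m (fun n h => absurd h (hnf n))
\end{verbatim}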
The inversion lemmas we need are easily proven by induction.
\begin{mathpar}
\mprset{fraction={-~}{-~}{-}}
@ -537,10 +534,28 @@ This rules out looping and diverging terms, since they never stop stepping.
using congruence of single-step reduction.
\end{proof}
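To illustrate the shape of these inversion arguments, the following Lean sketch (again hypothetical, over an abstract step relation and an arbitrary unary construct \texttt{f}) shows how congruence of single-step reduction turns strong normalization of a compound term into strong normalization of its subterm.
\begin{verbatim}
-- (`Sn` is repeated from the previous sketch so this block stands alone.)
inductive Sn {α : Type} (step : α → α → Prop) : α → Prop where
  | intro (m : α) (h : ∀ n, step m n → Sn step n) : Sn step m

-- If every step of `m` lifts through `f` (congruence of single-step
-- reduction), then strong normalization of `f m` yields that of `m`.
theorem Sn.of_congr {α : Type} {step : α → α → Prop} (f : α → α)
    (cong : ∀ {m m'}, step m m' → step (f m) (f m')) :
    ∀ {m : α}, Sn step (f m) → Sn step m := by
  -- Generalize the index `f m` so induction on the derivation goes through.
  suffices h : ∀ t, Sn step t → ∀ m, t = f m → Sn step m by
    intro m hfm
    exact h (f m) hfm m rfl
  intro t ht
  induction ht with
  | intro t _hred ih =>
    intro m heq
    subst heq
    exact .intro m (fun n hstep => ih (f n) (cong hstep) n rfl)
\end{verbatim}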
In contrast, while congruence rules for $\kw{SN}$ hold by definition,
they require some work to prove for $\kw{sn}$.
The strategy is to mirror the inductive characterization,
and define corresponding notions of neutral terms and head reduction,
with strongly neutral terms being those both neutral and strongly normal.
As before, variables are the only neutral values,
but for symmetry we write $[[v ∈ ne]]$ to say that $[[v]]$ is a variable.
\drules[sr]{$[[m ⤳' n]]$}{strong head reduction}{Thunk,Lam,Ret,Inl,Inr,App,Let}
\drules[ne]{$[[m ∈ ne]]$}{neutral computations}{Force,App,Let,Case}
\begin{definition}[Strongly neutral computations]
A strongly neutral computation, written $[[m ∈ sne]]$,
is a computation that is both neutral and strongly normalizing,
\ie $[[m ∈ ne]]$ and $[[m ∈ sn]]$.
\end{definition}
The admissible congruence rules for $\kw{sn}$
mirror exactly the congruence constructors for $\kw{SNe}$ and $\kw{SN}$,
replacing these by $\kw{sne}$ and $\kw{sn}$ in each premise.
Proving them additionally requires showing that small-step reduction preserves neutrality.
\begin{lemma}[Preservation] \label{lem:preservation} \leavevmode
\begin{enumerate}
\item If $[[v ⇝ w]]$ and $[[v ∈ ne]]$ then $[[w ∈ ne]]$.
@ -552,12 +567,6 @@ This rules out looping and diverging terms, since they never stop stepping.
By mutual induction on the single-step reductions.
\end{proof}
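Abstractly, preservation is what makes strongly neutral terms closed under single-step reduction, which is the kind of fact the inductions behind the admissible rules rely on. A hypothetical Lean sketch, with neutrality as an abstract predicate:
\begin{verbatim}
-- (`Sn` is repeated from the earlier sketch so this block stands alone.)
inductive Sn {α : Type} (step : α → α → Prop) : α → Prop where
  | intro (m : α) (h : ∀ n, step m n → Sn step n) : Sn step m

-- Strongly neutral: neutral and strongly normalizing.
abbrev Sne {α : Type} (step : α → α → Prop) (ne : α → Prop) (m : α) : Prop :=
  ne m ∧ Sn step m

-- If single-step reduction preserves neutrality, then strongly neutral
-- terms are closed under single-step reduction.
theorem Sne.of_step {α : Type} {step : α → α → Prop} {ne : α → Prop} {m n : α}
    (pres : ∀ {m n}, step m n → ne m → ne n)
    (hne : ne m) (hsn : Sn step m) (hstep : step m n) : Sne step ne n := by
  refine ⟨pres hstep hne, ?_⟩
  cases hsn with
  | intro _m hred => exact hred n hstep
\end{verbatim}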
{\mprset{fraction={-~}{-~}{-}}
\drules[sn]{$[[v ∈ sn]]$}{admissible strongly normal values}{Var,Unit,Inl,Inr,Thunk}
\drules[sn]{$[[m ∈ sn]]$}{admissible strongly normal computations}{Lam,Ret,Force,App,Let,Case}
@ -581,7 +590,11 @@ This rules out looping and diverging terms, since they never stop stepping.
or by induction on their sole premise.
\end{proof}
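The following Lean sketch shows, in the same abstract setting as before, why rules of this shape are admissible for the accessibility-style definition: if every reduct of \texttt{f m} comes from a reduct of \texttt{m}, which is what neutrality of the head position guarantees for the application and let rules, then \texttt{f} preserves strong normalization. This is only a hypothetical unary abstraction; the actual rules involve several subterms and track neutrality explicitly.
\begin{verbatim}
-- (`Sn` is repeated from the earlier sketch so this block stands alone.)
inductive Sn {α : Type} (step : α → α → Prop) : α → Prop where
  | intro (m : α) (h : ∀ n, step m n → Sn step n) : Sn step m

-- If every reduct of `f m` is `f m'` for some reduct `m'` of `m`
-- (i.e. `f` introduces no head redexes of its own),
-- then `f` preserves strong normalization.
theorem Sn.congr {α : Type} {step : α → α → Prop} (f : α → α)
    (inv : ∀ m n, step (f m) n → ∃ m', n = f m' ∧ step m m') :
    ∀ {m : α}, Sn step m → Sn step (f m) := by
  intro m hm
  induction hm with
  | intro m _hred ih =>
    refine .intro (f m) (fun n hstep => ?_)
    cases inv m n hstep with
    | intro m' h' =>
      rw [h'.1]
      exact ih m' h'.2
\end{verbatim}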
% admissible rules missing backward closure \rref{SN-Red}
The only missing corresponding rule is that for \rref{SN-Red},
which closes strongly normal terms backward under head reduction.
Proving this for $\kw{sn}$ requires much more work and many more intermediate lemmas.
To show backward closure under $\beta$-reduction, or \emph{head expansion},
we first need antisubstitution to be able to undo substitutions.
\begin{lemma}[Antisubstitution (\textsf{sn})] \label{lem:antisubst-sn}
If $[[m{x ↦ v} ∈ sn]]$ and $[[v ∈ sn]]$ then $[[m ∈ sn]]$.
@ -608,7 +621,14 @@ This rules out looping and diverging terms, since they never stop stepping.
is used to satisfy the $\beta$-reduction cases.
\end{proof}
What remains is backward closure under congruent reduction in head position.
Because showing that a term is strongly normal involves single-step reduction,
we need to ensure that head reduction and single-step reduction commute with each other.
The proof of commutativity is lengthy and involves careful analysis
of the structure of both reductions.
\begin{lemma}[Commutativity] \label{lem:commute}
If $[[m ⇝ n1]]$ and $[[m ⤳' n2]]$, then either $[[n1]] = [[n2]]$,
or there exists some $[[m']]$ such that
$[[n1 ⤳' m']]$ and $[[n2 ⇝ m']]$.
@ -627,7 +647,8 @@ This rules out looping and diverging terms, since they never stop stepping.
\end{mathpar}
\begin{proof}
First, use \rref{sn-App-inv1,sn-App-inv2,sn-Let-inv1,sn-Let-inv2}
to obtain $[[m ∈ sn]]$ and $[[v ∈ sn]]$/$[[n ∈ sn]]$.
Then proceed by double induction on the derivations of
$[[m ∈ sn]]$ and $[[v ∈ sn]]$/$[[n ∈ sn]]$,
again as lexicographic induction.
@ -635,7 +656,7 @@ This rules out looping and diverging terms, since they never stop stepping.
then it steps to a strongly normal term.
Strong reduction of the head position eliminates the cases of $\beta$-reduction,
leaving the cases where the head position steps or the other position steps.
If the head position steps, we use \nameref{lem:commute}
to join the strong reduction and the single-step reduction together,
then use the first induction hypothesis.
Otherwise, we use the second induction hypothesis.
@ -643,6 +664,8 @@ This rules out looping and diverging terms, since they never stop stepping.
since we have no congruence rule for when the head position is not neutral.
\end{proof}
All of these lemmas at last give us backward closure.
\begin{lemma}[Backward closure] \label{lem:closure-sn}
If $[[m ⤳' n]]$ and $[[n ∈ sn]]$ then $[[m ∈ sn]]$.
\end{lemma}
@ -653,6 +676,13 @@ This rules out looping and diverging terms, since they never stop stepping.
\rref{sn-Force-Thunk,sn-App-Lam,sn-Let-Ret,sn-Case-Inl,sn-Case-Inr,sn-App-bwd,sn-Let-bwd}.
\end{proof}
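For contrast, backward closure would be immediate if the head redex were the only possible step; the machinery above exists because full reduction can also step in other positions, and because head reduction itself performs substitution. A hypothetical Lean sketch of the easy case:
\begin{verbatim}
-- (`Sn` is repeated from the earlier sketch so this block stands alone.)
inductive Sn {α : Type} (step : α → α → Prop) : α → Prop where
  | intro (m : α) (h : ∀ n, step m n → Sn step n) : Sn step m

-- Backward closure is trivial when `n` is the only reduct of `m`;
-- the lemmas above account for all the other reducts.
theorem Sn.of_unique_step {α : Type} {step : α → α → Prop} {m n : α}
    (huniq : ∀ n', step m n' → n' = n) (hn : Sn step n) : Sn step m := by
  refine .intro m (fun n' h => ?_)
  rw [huniq n' h]
  exact hn
\end{verbatim}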
This gives us all the pieces to show that the inductive characterization
is sound with respect to the traditional presentation of strongly normal terms,
using neutral terms and head reduction as intermediate notions to strengthen the induction hypotheses,
though soundness for neutrality can be shown independently.
Combining soundness of strongly normal terms with soundness of syntactic typing,
traditional strong normalization of well-typed terms follows as a corollary.
\begin{lemma}[Soundness (\textsf{SNe})] \label{lem:sound-sne}
If $[[m ∈ SNe]]$ then $[[m ∈ ne]]$.
\end{lemma}

View File

@ -645,13 +645,11 @@ case (inr v) | y.m | z.n ∈ sn
% closure
m ⤳' n
v ∈ sn
n v ∈ sn
-------- :: App_bwd
m v ∈ sn
m ⤳' m'
n ∈ sn
let x m' n ∈ sn
--------------- :: Let_bwd
let x m n ∈ sn

BIN
cbpv.pdf (Stored with Git LFS)

Binary file not shown.

View File

@ -500,7 +500,7 @@ Furthermore, $ \ottnt{m} \leadsto \ottnt{n} $ isn't really a full reduction re
since it's defined mutually with strongly normal terms,
and it doesn't describe reducing anywhere other than in the head position.
The solution we take from POPLMark Reloaded is to prove that $\kw{SN}$ is sound
with respect to a traditional presentation of strongly normal terms
based on full small-step reduction $ \ottnt{v} \rightsquigarrow \ottnt{w} $, $ \ottnt{m} \rightsquigarrow \ottnt{n} $ of values and computations.
I omit here the definitions of these reductions,
@ -519,10 +519,7 @@ This rules out looping and diverging terms, since they never stop stepping.
\drule[]{sn-Com}
\end{mathpar}
The inversion lemmas we need are easily proven by induction.
\begin{mathpar}
\mprset{fraction={-~}{-~}{-}}
@ -537,10 +534,28 @@ This rules out looping and diverging terms, since they never stop stepping.
using congruence of single-step reduction.
\end{proof}
In contrast, while congruence rules for $\kw{SN}$ hold by definition,
they require some work to prove for $\kw{sn}$.
The strategy is to mirror the inductive characterization,
and define corresponding notions of neutral terms and head reduction,
with strongly neutral terms being those both neutral and strongly normal.
As before, variables are the only neutral values,
but for symmetry we write $ \ottnt{v} \in \kw{ne} $ to say that $\ottnt{v}$ is a variable.
\drules[sr]{$ \ottnt{m} \leadsto_{ \kw{sn} } \ottnt{n} $}{strong head reduction}{Thunk,Lam,Ret,Inl,Inr,App,Let}
\drules[ne]{$ \ottnt{m} \in \kw{ne} $}{neutral computations}{Force,App,Let,Case}
\begin{definition}[Strongly neutral computations]
A strongly neutral computation, written $ \ottnt{m} \in \kw{sne} $,
is a computation that is both neutral and strongly normalizing,
\ie $ \ottnt{m} \in \kw{ne} $ and $ \ottnt{m} \in \kw{sn} $.
\end{definition}
The admissible congruence rules for $\kw{sn}$
mirror exactly the congruence constructors for $\kw{SNe}$ and $\kw{SN}$,
replacing these by $\kw{sne}$ and $\kw{sn}$ in each premise.
Proving them additionally requires showing that small-step reduction preserves neutrality.
\begin{lemma}[Preservation] \label{lem:preservation} \leavevmode
\begin{enumerate}
\item If $ \ottnt{v} \rightsquigarrow \ottnt{w} $ and $ \ottnt{v} \in \kw{ne} $ then $ \ottnt{w} \in \kw{ne} $.
@ -552,12 +567,6 @@ This rules out looping and diverging terms, since they never stop stepping.
By mutual induction on the single-step reductions.
\end{proof}
{\mprset{fraction={-~}{-~}{-}}
\drules[sn]{$ \ottnt{v} \in \kw{sn} $}{admissible strongly normal values}{Var,Unit,Inl,Inr,Thunk}
\drules[sn]{$ \ottnt{m} \in \kw{sn} $}{admissible strongly normal computations}{Lam,Ret,Force,App,Let,Case}
@ -581,7 +590,11 @@ This rules out looping and diverging terms, since they never stop stepping.
or by induction on their sole premise.
\end{proof}
% admissible rules missing backward closure \rref{SN-Red}
The only missing corresponding rule is that for \rref{SN-Red},
which closes strongly normal terms backward under head reduction.
Proving this for $\kw{sn}$ requires much more work and many more intermediate lemmas.
To show backward closure under $\beta$-reduction, or \emph{head expansion},
we first need antisubstitution to be able to undo substitutions.
\begin{lemma}[Antisubstitution (\textsf{sn})] \label{lem:antisubst-sn}
If $ \ottnt{m} [ \ottmv{x} \mapsto \ottnt{v} ] \in \kw{sn} $ and $ \ottnt{v} \in \kw{sn} $ then $ \ottnt{m} \in \kw{sn} $.
@ -608,7 +621,14 @@ This rules out looping and diverging terms, since they never stop stepping.
is used to satisfy the $\beta$-reduction cases.
\end{proof}
What remains is backward closure under congruent reduction in head position.
Because showing that a term is strongly normal involves single-step reduction,
we need to ensure that head reduction and single-step reduction commute with each other.
The proof of commutativity is lengthy and involves careful analysis
of the structure of both reductions.
\begin{lemma}[Commutativity] \label{lem:commute}
If $ \ottnt{m} \rightsquigarrow \ottnt{n_{{\mathrm{1}}}} $ and $ \ottnt{m} \leadsto_{ \kw{sn} } \ottnt{n_{{\mathrm{2}}}} $, then either $\ottnt{n_{{\mathrm{1}}}} = \ottnt{n_{{\mathrm{2}}}}$,
or there exists some $\ottnt{m'}$ such that
$ \ottnt{n_{{\mathrm{1}}}} \leadsto_{ \kw{sn} } \ottnt{m'} $ and $ \ottnt{n_{{\mathrm{2}}}} \rightsquigarrow \ottnt{m'} $.
@ -627,7 +647,8 @@ This rules out looping and diverging terms, since they never stop stepping.
\end{mathpar}
\begin{proof}
First, use \rref{sn-App-inv1,sn-App-inv2,sn-Let-inv1,sn-Let-inv2}
to obtain $ \ottnt{m} \in \kw{sn} $ and $ \ottnt{v} \in \kw{sn} $/$ \ottnt{n} \in \kw{sn} $.
Then proceed by double induction on the derivations of
$ \ottnt{m} \in \kw{sn} $ and $ \ottnt{v} \in \kw{sn} $/$ \ottnt{n} \in \kw{sn} $,
again as lexicographic induction.
@ -635,7 +656,7 @@ This rules out looping and diverging terms, since they never stop stepping.
then it steps to a strongly normal term.
Strong reduction of the head position eliminates the cases of $\beta$-reduction,
leaving the cases where the head position steps or the other position steps.
If the head position steps, we use \nameref{lem:commute}
to join the strong reduction and the single-step reduction together,
then use the first induction hypothesis.
Otherwise, we use the second induction hypothesis.
@ -643,6 +664,8 @@ This rules out looping and diverging terms, since they never stop stepping.
since we have no congruence rule for when the head position is not neutral.
\end{proof}
All of these lemmas at last give us backward closure.
\begin{lemma}[Backward closure] \label{lem:closure-sn}
If $ \ottnt{m} \leadsto_{ \kw{sn} } \ottnt{n} $ and $ \ottnt{n} \in \kw{sn} $ then $ \ottnt{m} \in \kw{sn} $.
\end{lemma}
@ -653,6 +676,13 @@ This rules out looping and diverging terms, since they never stop stepping.
\rref{sn-Force-Thunk,sn-App-Lam,sn-Let-Ret,sn-Case-Inl,sn-Case-Inr,sn-App-bwd,sn-Let-bwd}.
\end{proof}
This gives us all the pieces to show that the inductive characterization
is sound with respect to the traditional presentation of strongly normal terms,
using neutral terms and head reduction as intermediate notions to strengthen the induction hypotheses,
though soundness for neutrality can be shown independently.
Combining soundness of strongly normal terms with soundness of syntactic typing,
traditional strong normalization of well-typed terms follows as a corollary.
\begin{lemma}[Soundness (\textsf{SNe})] \label{lem:sound-sne}
If $ \ottnt{m} \in \kw{SNe} $ then $ \ottnt{m} \in \kw{ne} $.
\end{lemma}

View File

@ -985,7 +985,6 @@
\newcommand{\ottdrulesnXXAppXXbwd}[1]{\ottdrule[#1]{%
\ottpremise{ \ottnt{m} \leadsto_{ \kw{sn} } \ottnt{n} }%
\ottpremise{ \ottnt{v} \in \kw{sn} }%
\ottpremise{ \ottnt{n} \gap \ottnt{v} \in \kw{sn} }%
}{
\ottnt{m} \gap \ottnt{v} \in \kw{sn} }{%
@ -995,7 +994,6 @@
\newcommand{\ottdrulesnXXLetXXbwd}[1]{\ottdrule[#1]{%
\ottpremise{ \ottnt{m} \leadsto_{ \kw{sn} } \ottnt{m'} }%
\ottpremise{ \ottnt{n} \in \kw{sn} }%
\ottpremise{ \kw{let} \gap \ottmv{x} \leftarrow \ottnt{m'} \gap \kw{in} \gap \ottnt{n} \in \kw{sn} }%
}{
\kw{let} \gap \ottmv{x} \leftarrow \ottnt{m} \gap \kw{in} \gap \ottnt{n} \in \kw{sn} }{%