Self-Adjusting Computation
Robert Harper
Carnegie Mellon University
(With Umut Acar and Guy Blelloch)
The Problem
• Given a static algorithm, obtain a dynamic,
or incremental, version.
– Maintain a sorted list under insertions and
deletions
– Maintain a convex hull under motions of the
points.
– Maintain semantics of a program under edits.
Example: Sorting
Input: 5, 1, 4, 2, 3
Output: 1, 2, 3, 4, 5
Dynamic Algorithms
• There is a large body of work on
dynamic / incremental algorithms.
– Specific techniques for specific problems.
• Our interest is in general methods,
rather than ad hoc solutions.
– Applying them to a variety of problems.
– Understanding when these methods apply.
Self-Adjusting Computation
• Self-adjusting computation is a method
for “dynamizing” a static algorithm.
– Start with a static algorithm for a problem.
– Make it robust under specified changes.
• Goal: “fast” response to “small” change.
– “Fast” and “small” are problem-specific!
– As ever, the analysis can be difficult.
Self-Adjusting Computation
• Generalizes incremental computation.
– Attribute grammars, circuit models assume static
control dependencies.
– SAC permits dynamic dependencies.
• Combines algorithmic and programming
language techniques.
– Linguistic tools to ensure correctness relative to
static algorithm.
– Algorithmic techniques for efficient implementation.
Self-Adjusting Computation
• Adaptivity:
Propagate the effects on the output of a
change to the input.
• Selective Memoization:
Reuse old results, provided they are valid
after change.
• Adaptive Memoization:
Reuse old results, even though they may not
be valid after change.
Model of Computation
• Purely functional programming model.
– Data structures are persistent.
– No implicit side effects or mutation.
• Imperative model of change.
– Run on initial input to obtain output.
– Make modifications to the input.
– Propagate changes to the output.
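The run/modify/propagate model can be sketched in Python rather than the talk's ML. `Mod`, `read`, and `write` are my own names for a hypothetical modifiable cell: `read` records the reading computation so that a later `write` can re-run it.

```python
# A minimal sketch of the model of computation, assuming a single cell.
class Mod:
    def __init__(self, value=None):
        self.value = value
        self.readers = []            # computations to re-run on change

    def read(self, reader):
        self.readers.append(reader)  # record the dependency
        reader(self.value)

    def write(self, value):          # imperative change from outside
        if value != self.value:
            self.value = value
            for reader in list(self.readers):
                reader(value)        # change propagation: re-run readers

# Stage 1: run the pure computation on the initial input.
inp, out = Mod(3), Mod()
inp.read(lambda v: out.write(v + 10))
assert out.value == 13

# Stage 2: modify the input; the effect propagates to the output.
inp.write(5)
assert out.value == 15
```

Within a run everything is pure; mutation happens only between runs, exactly as the bullets above describe.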
Model of Computation
[Figure: within a run, the steps of execution are pure/persistent; the sequence of stages (run, modify, propagate) is imperative/ephemeral.]
A Simple Example: Map
data cell = nil | cons of int * list
and list = cell

[Figure: map adds 10 to each element, e.g. the list 2, 3, 4 maps to 12, 13, 14.]
Dynamic Version of Map
• To permit insertions and deletions, lists
are made modifiable:
data cell = nil | cons of int * list
and list = cell mod
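The modifiable-list type can be mirrored in Python (my own names, not SAC's API): each tail is a modifiable cell, so an insertion writes one cell in place instead of rebuilding the spine.

```python
# A sketch of lists with modifiable tails, assuming a bare Mod cell.
class Mod:
    def __init__(self, value):
        self.value = value

def from_list(xs):
    """Build nil | cons(head, tail) where every tail is a Mod."""
    cell = None                        # nil
    for x in reversed(xs):
        cell = (x, Mod(cell))          # cons of int * list, list = cell mod
    return Mod(cell)

def to_list(l):
    out, cell = [], l.value
    while cell is not None:
        out.append(cell[0])
        cell = cell[1].value
    return out

def insert_after(l, i, x):
    """Splice x in after the element at index i by writing one modifiable."""
    cell = l.value
    for _ in range(i):
        cell = cell[1].value
    _, tail = cell
    tail.value = (x, Mod(tail.value))  # the single imperative change

l = from_list([2, 3, 5])
insert_after(l, 1, 4)                  # list is now 2, 3, 4, 5
assert to_list(l) == [2, 3, 4, 5]
```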
Dynamic Version of Map
Insertion changes a modifiable:
[Figure: 4 is spliced into the list 2, 3, 5 by writing one modifiable.]
Dynamic Version of Map
We’d like to obtain the result …
[Figure: the updated result 12, 13, 14, 15.]
Dynamic Version of Map
• Can we update the result in O(1) time?
– Make one new call to map.
– Splice new cell into “old” result.
• Yes, using self-adjusting computation!
– Adaptivity: call map on the new node.
– Memoization: re-synchronize with suffix.
Adaptivity Overview
• To make map adaptive, ensure that
– Changes invalidate results that depend on
the modified value.
– Computations dependent on a change are
re-run with the “new” value.
• Two key ideas:
– Make access to modifiables explicit.
– Maintain dependencies dynamically.
Adaptive Map
data cell = nil | cons of int * list
and list = cell mod

fun map (l:list) =
  mod                                  (* allocate new modifiable *)
    (let mod c = l in                  (* read old modifiable *)
       write (map' c))                 (* write new modifiable *)
and map' c =
  case c of
    nil => nil
  | cons (h, t) => cons (h+10, map t)
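The adaptive map can be sketched runnably in Python (all names here are my own, not SAC's API). For simplicity each modifiable keeps a single reader; the real system tracks many reads with timestamps, but the shape of the computation is the same: a write re-runs the reader of the changed cell, which rebuilds the output suffix in place.

```python
# A sketch of adaptivity, assuming one reader per modifiable.
class Mod:
    def __init__(self, value=None):
        self.value = value
        self.reader = None               # simplification: one reader per cell

    def read(self, f):
        self.reader = f                  # record the dependency
        f(self.value)

    def write(self, v):
        self.value = v
        if self.reader is not None:
            self.reader(v)               # change propagation: re-run reader

def amap(l):
    """Adaptive map (+10): allocate an output modifiable, read the input."""
    dest = Mod()
    l.read(lambda cell: dest.write(map_(cell)))
    return dest

def map_(cell):                          # cell: None | (head, tail: Mod)
    if cell is None:
        return None
    h, t = cell
    return (h + 10, amap(t))

def from_list(xs):
    cell = None
    for x in reversed(xs):
        cell = (x, Mod(cell))
    return Mod(cell)

def to_list(m):
    out, cell = [], m.value
    while cell is not None:
        out.append(cell[0])
        cell = cell[1].value
    return out

inp = from_list([2, 3, 5])
out = amap(inp)
assert to_list(out) == [12, 13, 15]

# Insert 4 before the 5 by writing one input modifiable; the write re-runs
# map_ on the changed cell, and the existing output updates in place.
tail_of_3 = inp.value[1].value[1]        # the Mod holding the (5, ...) cell
tail_of_3.write((4, Mod(tail_of_3.value)))
assert to_list(out) == [12, 13, 14, 15]
```

Note that without memoization the whole suffix after the change is rebuilt, which is why adaptivity alone gives only a linear-time update.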
Adaptive Map
Modification to input:
[Figure: 4 is spliced into 2, 3, 5; the modified cell is the argument of map', which must be re-run.]
Adaptive Map
• Associated output is invalidated, and
suffix is re-created.
[Figure: the output suffix is rebuilt as 14, 15; the result of map' is written into the invalidated cell.]
Adaptive Programming
• Crux: dependencies among modifiables.
– Writing a modifiable invalidates any
computation that reads it.
– One read can be contained within another.
• Dependencies are fully dynamic!
– Cells are allocated dynamically.
– Reads affect control flow.
Adaptive Programming
• Change propagation consists of
– Re-running readers of changed cells.
– Updating dependencies during re-run.
• To ensure correctness,
– All dependencies must be accurately tracked.
– Containment ordering must be maintained.
• Linguistic tools enforce these requirements!
Type System for Adaptivity
• The type τ mod is a modality.
– From lax modal logic.
– And therefore forms a monad.
• Two modes of expression:
– Stable: ordinary functional code, not
affected by changes.
– Changeable: affected by change, will be
written to another modifiable.
Type System for Adaptivity
• Elimination form for τ mod:
let mod x:τ = s in c end
– Read modifiable given by s.
– Bind value to x:τ , evaluate c.
• Makes dependencies explicit:
– Records read of given modifiable.
– Re-run c with new x if changed.
– Reads within c are contained in this read.
Type System for Adaptivity
• Execution maintains a trace of adaptive events.
– Creation of a modifiable.
– Writes to a modifiable.
– Reads of a modifiable.
• Containment is recorded using the Dietz-Sleator
order-maintenance algorithm.
– Associate time intervals with events.
– Re-run reader within the “old” time interval.
– Requires arbitrary fractionation of time steps.
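The interface of the order-maintenance structure can be sketched with a naive O(n)-per-operation version in Python (my own names): insert a fresh timestamp after an existing one, and ask which of two timestamps comes first. Dietz-Sleator supports both in O(1) amortized time.

```python
# A naive stand-in for the order-maintenance structure; the interface,
# not the efficiency, is the point.
class OrderList:
    def __init__(self):
        self.items = []

    def new_after(self, t=None):
        tag = object()               # a fresh, opaque timestamp
        if t is None:
            self.items.append(tag)
        else:
            self.items.insert(self.items.index(t) + 1, tag)
        return tag

    def precedes(self, a, b):
        return self.items.index(a) < self.items.index(b)

o = OrderList()
t1 = o.new_after()
t3 = o.new_after(t1)
t2 = o.new_after(t1)   # arbitrary fractionation: squeeze t2 between t1 and t3
assert o.precedes(t1, t2) and o.precedes(t2, t3)
```

Re-running a reader "within its old interval" then amounts to allocating its new event timestamps between the interval's endpoints, which is exactly what `new_after` provides.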
Adaptive Map
• Responds to insertion in linear time:
[Figure: the output after insertion, 12, 13, 14, 15; the entire suffix after the change was rebuilt.]
Memoizing Map
• For constant-time update, we must re-synchronize
with the old result.
– Results after insertion point remain valid
despite change.
– Re-use, rather than recompute, to save
time.
• Selective memoization is a general
technique for achieving this.
Selective Memoization
• Standard memoization is data-driven.
– Associate with a function f a finite set of
ordered pairs (x, f(x)).
– Consult memo table before call, update
memo table after call.
• Cannot handle partial dependencies.
– E.g., read only the first 10 elements of an array.
– E.g., use an approximation of the input.
Selective Memoization
• Selective memoization is control-driven.
– Guided by the exploration of the input.
– Sensitive to approximations.
• Associate results with control paths.
– “Have I been here before?”
– Control path records dependencies of
output on input.
Memoized Adaptive Map
fun map (l:list) =
  mod
    (let mod c = l in write (map' c))
and memo map' c =
  mcase c of
    nil => return (nil)            (* depends only on nil/cons *)
  | cons (h, t) =>
      let !h' = h and !t' = t in   (* depends on nil/cons, head, and tail *)
        return (cons (h'+10, map t'))
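The re-synchronization that memoized `map'` achieves can be sketched in Python (my own names; identity-keyed memoization stands in for the `!h'`/`!t'` memo keys). When a new cell is spliced in front of an unchanged suffix, re-running map rebuilds only the cells up to the splice point and reuses the suffix's old output, identity included.

```python
# A sketch of re-use by memoization over cell identity.
memo = {}
calls = 0

def mmap(cell):                       # cell: None | (head, tail-cell)
    global calls
    if id(cell) in memo:              # keys stay valid: all cells stay live
        return memo[id(cell)]
    calls += 1
    result = None if cell is None else (cell[0] + 10, mmap(cell[1]))
    memo[id(cell)] = result
    return result

suffix = (5, None)
old_in = (2, (3, suffix))
old_out = mmap(old_in)                # 12, 13, 15: four calls (incl. nil)
new_in = (2, (3, (4, suffix)))        # splice in 4, sharing the old suffix
new_out = mmap(new_in)                # only the three new cells are mapped
assert calls == 7
assert new_out[1][1][1] is old_out[1][1]   # old suffix output reused as-is
```

Here the whole run restarts from the head for simplicity; in SAC only the reader of the changed cell re-runs, and memoization handles the suffix exactly as shown, giving the constant-time update.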
Memoized Adaptive Map
• With selective memoization we obtain
[Figure: after inserting 4 into 2, 3, 5, only the new cell's result is computed; map re-synchronizes with the old output suffix.]
Adaptive Memoization
• Effectiveness of memoization depends
on preserving identity.
– Modifiables compare “by reference”.
– Copying a modifiable impedes re-use.
• This conflicts with the functional
programming model.
– E.g., functional insertions copy structure.
– Undermines effectiveness of memoization.
Adaptive Memoization
• Consider again applying map to:
[Figure: the input 2, 3, 5 and its output 12, 13, 15.]
Adaptive Memoization
• Now functionally insert an element:
[Figure: 4 is functionally inserted into 2, 3, 5, copying structure rather than writing a modifiable.]
Adaptive Memoization
• Running map on result yields
[Figure: running map on the new input builds an entirely fresh output 12, 13, 14, 15.]
Adaptive Memoization
• Subsequent runs propagate the effect:
[Figure: a subsequent map over that output likewise rebuilds 22, 23, 24, 25 from scratch.]
Adaptive Memoization
• Ideally, we’d re-use the “old” prefix!
[Figure: ideally the old output cells 12, 13, 15 would be reused, with only the 14 built fresh.]
Adaptive Memoization
• Permit inaccurate memoization.
– Allows recovery of “old” cells.
– Cached result will be incorrect.
• Adapt incorrect result to restore
correctness.
– Use change propagation to revise answer.
– Only sensible in conjunction with
adaptivity!
Adaptive Memoization
fun map (l:list) = …
and memo map' c =
  mcase c of
    nil => return (nil)
  | cons (h, t) =>
      let !h' = h in    (* memo-match only on nil/cons and the head *)
      let ?t' = t in    (* do not record a dependency on the tail! *)
        return (cons (h'+10, map t'))
Adaptively Memoized Map
• On the initial input …
[Figure: the initial input 2, 3, 5 and its output 12, 13, 15.]
Adaptively Memoized Map
• After a functional update, the input is
[Figure: a 4 is functionally inserted into 2, 3, 5.]
Adaptively Memoized Map
• Now map yields the inaccurate result:
[Figure: memoization returns the old output 12, 13, 15 as the result of map on the new input 2, 3, 4, 5; the tail of the result is incorrect.]
Adaptively Memoized Map
• Change to input propagates to output
[Figure: change propagation repairs the output to 12, 13, 14, 15.]
Some Results
• Quicksort: expected O(lg n) update after
insert or delete at a random position.
• Mergesort: expected O(lg n) update after
insert or delete.
• Tree Contraction: O(lg n) update after
adding or deleting an edge.
• Kinetic Quickhull: O(lg n) per event,
measured empirically.
Ongoing Work
• When can an algorithm be dynamized?
– Consider edit distance between traces for
a class of input changes.
– Small edit distance suggests we can build
a dynamic version using SAC.
• What is a good semantic model?
– Current methods are rather ad hoc.
– Are there better models?
Conclusion
• Self-adjusting computation is a powerful
method for building dynamic algorithms.
– Systematic methodology.
– Simple correctness criteria.
– Easy to implement.
• The interplay between linguistic and
algorithmic methods is vital!
– λ is also a powerful algorithmic tool!
Questions?