It is surprisingly difficult to quantify an algorithm's contribution to the state of the art. Reporting an algorithm's standalone performance wrongly rewards near-clones while penalizing algorithms that have small but distinct areas of strength. Measuring an algorithm's marginal contribution is better, but penalizes sets of strongly correlated algorithms, thereby obscuring situations in which it is essential to have at least one algorithm from such a set. Neither of these measures takes time into account, penalizing algorithms that are no longer state of the art but were when they were introduced. In this talk, I will argue that contributions should be analyzed via a measure drawn from coalitional game theory, the Shapley value, and its time-sensitive cousin, the temporal Shapley value. The temporal Shapley value retains the desirable properties of the Shapley value while taking into account the time at which each algorithm was introduced, relative to the state of the art at that time. These measures characterize the contribution of an algorithm fairly and can yield insight into a research community's progress over time.
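As a concrete illustration of the underlying idea, the sketch below computes exact Shapley values for a small algorithm portfolio by enumerating all coalitions, assuming that a coalition's value is the time its virtual best solver saves relative to a timeout (best solver per instance). The `runtimes` table, the `TIMEOUT` constant, and the helper functions are hypothetical and stand in for whatever performance measure one actually cares about; exact enumeration is exponential in portfolio size, so sampling approximations are used for larger portfolios.

```python
from itertools import combinations
from math import factorial

# Hypothetical per-instance runtimes (seconds) for three solvers; lower is better.
runtimes = {
    "solver_A": [10.0, 50.0, 5.0],
    "solver_B": [12.0, 8.0, 60.0],
    "solver_C": [11.0, 9.0, 6.0],
}
TIMEOUT = 100.0  # assumed penalty when a coalition contains no solver

def coalition_value(coalition):
    """Value of a coalition: total time saved versus the timeout by its
    virtual best solver (i.e. taking the best member on each instance)."""
    if not coalition:
        return 0.0
    n_instances = len(next(iter(runtimes.values())))
    best = [min(runtimes[s][i] for s in coalition) for i in range(n_instances)]
    return sum(TIMEOUT - t for t in best)

def shapley_values(players):
    """Exact Shapley values: weighted average of each player's marginal
    contribution over all coalitions of the other players."""
    n = len(players)
    phi = {p: 0.0 for p in players}
    for p in players:
        others = [q for q in players if q != p]
        for k in range(len(others) + 1):
            for subset in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                marginal = coalition_value(subset + (p,)) - coalition_value(subset)
                phi[p] += weight * marginal
    return phi

print(shapley_values(tuple(runtimes)))
```

By construction the three Shapley values sum to the value of the full portfolio, and a near-clone of an existing solver receives little credit, since its marginal contribution to most coalitions is small.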

The proposed measures are applied to the famous quicksort algorithm and to algorithms from the AI areas of satisfiability and constraint programming. The results illustrate the benefits of the Shapley value and its temporal cousin. Our measures give insight into how the field has developed over time and were used in the 2021 AI Index to assess the progress of AI.