Computational photography (CP) is an emerging field that explores, and is poised to redefine, how we take photographs and videos. CP applies not only to "everyday" photography but also to new methods for scientific imaging, such as microscopy, biomedical imaging, and astronomical imaging, and can thus be expected to have a significant impact in many areas. There is an apparent convergence between CP methods, what has traditionally been called "image processing", and, more recently, much work in machine vision, all of which seem to address the same, or at least tightly related, problems: deblurring, denoising, and enhancement algorithms of various kinds. This convergence raises several questions:

  • What do we learn from this convergence and its application to CP?
  • Can we create more contact between the practitioners of these fields, who often do not interact?
  • Does this convergence mean that the fields are intellectually shrinking to the same point, or expanding and hence overlapping with each other more?

Besides discussing such questions, the goal of this workshop is twofold:

  • (i) to present the current approaches in CP, their possible limitations, and open problems to the NIPS community, and
  • (ii) to foster interaction between researchers from machine learning, neuroscience, and CP to advance the state of the art in CP.

The key to existing CP approaches is the combination of (i) creative hardware designs with (ii) sophisticated computation, such as new approaches to blind deconvolution. This interplay between hardware and software is what makes CP an ideal real-world domain for the whole NIPS community, which could contribute to its advancement in various ways, be it through new imaging devices made possible by the latest machine learning methods, or through camera and processing designs inspired by our neurological understanding of natural visual systems. The target participants are therefore researchers from the whole NIPS community (machine learning and neuroscience) as well as researchers working on CP and related fields.
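
To make the kind of computation we have in mind concrete, the sketch below implements a classical alternating Richardson-Lucy scheme for blind deconvolution in Python. It is a minimal illustration under stated assumptions, not a reference implementation of any particular CP system: the function name blind_richardson_lucy, the flat kernel initialization, the fixed iteration counts, and the availability of NumPy/SciPy are all our own choices, and the observed image is assumed to be non-negative and larger than the kernel support.

    import numpy as np
    from scipy.signal import fftconvolve

    def blind_richardson_lucy(y, ksize=15, n_outer=10, n_inner=5, eps=1e-8):
        """Estimate a sharp image x and blur kernel k from a blurry image y,
        assuming y is approximately x convolved with an unknown k (ksize odd)."""
        x = y.astype(float).copy()                   # initialize image with the observation
        k = np.full((ksize, ksize), 1.0 / ksize**2)  # initialize kernel as a flat blur
        r = ksize // 2
        for _ in range(n_outer):
            # Kernel update: multiplicative RL steps with the image held fixed.
            for _ in range(n_inner):
                ratio = y / (fftconvolve(x, k, mode='same') + eps)
                # Correlation of the ratio with x, via convolution with the flipped image;
                # the central ksize-by-ksize window corresponds to the kernel support.
                grad = fftconvolve(ratio, x[::-1, ::-1], mode='same')
                cy, cx = grad.shape[0] // 2, grad.shape[1] // 2
                k *= grad[cy - r:cy + r + 1, cx - r:cx + r + 1]
                k /= k.sum() + eps                   # keep the kernel normalized
            # Image update: multiplicative RL steps with the kernel held fixed.
            for _ in range(n_inner):
                ratio = y / (fftconvolve(x, k, mode='same') + eps)
                x *= fftconvolve(ratio, k[::-1, ::-1], mode='same')
        return x, k

In practice one would add non-negativity clipping, edge handling, and a stopping criterion; the point here is simply that the quality of the recovered image hinges both on the optics that produced y and on the computation that inverts it, which is exactly the hardware-software interplay described above.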