Modern Optical Engineering (with Labs)
2017 Course Description
June 21, Wednesday afternoon
Optical Engineering for Biomedical Optics, Professor James Zavislan (Rochester):
Design and analysis of systems in which the index of refraction varies as a function of the spatial coordinates; applications to conventional optical systems, fiber optics, and medical imaging.
June 22, Thursday morning
Optical Testing & Instrumentation, Dr. Paul Murphy (QED Optics):
Interferometric optical testing methods, including Fizeau, Twyman-Green, Mach-Zehnder, scatterplate, and Smartt point-diffraction interferometers, are described for the testing of optical components and optical systems. Theory and applications of phase-shifting interferometers are discussed. Special techniques for the testing of aspheric surfaces are outlined.
June 22, Thursday afternoon
Diffractive and Micro-Optics Technology, Dr. Michael Morris (Rochester, RPC Photonics, Inc. and Apollo Optical Systems, Inc.):
Diffractive and micro-optics technology provides new degrees of freedom for the design and optimization of optical systems. In this course, emphasis is placed on recent advances in the design and fabrication of precision, micro-structured optical components and films, and on their applications in broadband imaging systems, laser-beam shaping, vision care, solid-state lighting, and display systems.
June 23, Friday morning
Optical Thin Films, Dr. James B. Oliver (Rochester):
Survey of applications for optical thin-film coatings; reflectance and transmittance at a boundary; vector methods and the Smith chart. Production considerations, including: vacuum evaporation; evaporation sources; uniformity calculations; thickness monitoring; chamber configuration; and materials.
June 23, Friday afternoon
Introduction to Electronic Imaging: A Systems Approach, Mr. Paul Kane (Eastman Kodak):
This course provides an overview of electronic imaging systems, describing the stages of image capture, digital processing, image output, and viewing by a human observer. The student will become familiar with sampling and aliasing as they pertain to two-dimensional sensor arrays, digitization and sensor noise sources, CCD and CMOS architectures, and issues relating to correct matching of lenses and sensor arrays. Basic digital image processing algorithms such as demosaicing, deconvolution, and sharpening will be discussed, and computational imaging concepts such as extended depth of field will be introduced. Digital image output in the form of projection displays, flat-panel displays, and digital writers will be discussed. Finally, the spatial, temporal, and chromatic response of the human visual system will be reviewed in the context of setting specifications for electronic imaging systems.