IBM used machine learning and experimental Watson APIs to parse the trailers of 100 horror movies. The system performed visual, audio, and compositional analysis of individual scenes, identifying what makes each moment eerie and how the score, the actors' tone of voice, and the framing and lighting combine to set a trailer's mood. Watson was then fed the full film and selected candidate scenes for the trailer. A human, in this case IBM's "resident filmmaker," still needed to step in and edit the selections together creatively. Even so, a process that would normally take weeks was reduced to hours.