Most directors believe editing (post-production) is the most important step in making a film. Although a shot is what you see, an edit is how you see it. An edit shapes the viewer's experience and steers the interpretation of a scene. And very often (unless intentionally otherwise), the viewer is completely unaware of this manipulation. Editors make it subconscious and seamless, so viewers believe they reached their conclusions on their own.
Lately, especially in blockbuster films, we primarily see continuity editing: the style of editing that promotes verisimilitude and efficient storytelling. It demands little critical thinking on the viewer's part, and each shot flows naturally from the last. This technique is usually favored when editing is not a primary focus of a film's development, but CGI or great acting performances are. (This is actually why the Dark Knight trilogy is amazing: it has it all. The first scene that comes to mind is the parallel editing [alternation between two simultaneous strands of action] of the famous scene where The Joker is yelling "Hit me" while Batman speeds toward him, possibly intending to actually kill him.)
Surprisingly, continuity editing is hard to uphold. Several rules must be followed for a scene to maintain "sense." The most important is the 180-degree rule: if the camera is on one "side" of a conversation (determined by an "axis of action" or "line of vision"), it must stay on that side across cuts unless, at some point, the camera visibly moves to the other side. Otherwise, the characters suddenly appear to have switched positions.
Slightly less important is the 30-degree rule, which states that each new shot must change the camera angle by at least 30 degrees from the previous one. Not only could a pan or tilt easily replace such a small-angle cut, but viewers also notice the cut and read it as an accidental stutter, a jump in the frames.
Continuity editing is easy on the viewer but hard on the director. Is that what makes it the "default" choice? Is it because, over the years, films have become less about creative expression and more about eye-candy consumerism? And why, in movies that are entirely fictional, do we try to maintain the realism of visual perception? Doesn't that contradict the fictional nature of the movie itself?