Documentary Film
From Encyclopedia of American Studies
Documentarists, unlike fiction filmmakers, often feel obligated to justify their work not simply as entertainment but as amelioration. American documentary film has not followed Hollywood in dominating global entertainment. Like its European counterpart, American documentary has its origins in science, still photography, photojournalism, anthropology, and a desire to render historical reality.