The walking dead have been a mainstay of horror films for decades. Although zombie lore was once confined to the West Indies, Hollywood has made sure audiences will fear zombies anywhere and everywhere.