There is something very alluring about Hollywood, the American film industry. The HBO comedy-drama series ‘Entourage’ stands out for exposing the industry in a brutal yet honest way, giving the audience more than just a voyeuristic view of it.