Anonymous asked in Travel > United States > Los Angeles · 2 months ago

Did Hollywood destroy California?

9 Answers

  • 2 months ago

    GREED is the only destructive force in America.

  • 2 months ago

    Hollywood made California. After the Gold Rush, it was Hollywood, as the seat of visual media, that made Los Angeles the nation's second most populous city and attracted world-class businesses over the course of the 20th century. I would argue one-party rule destroyed California. When two-thirds of all seats are held by the far left, things tend to turn ugly.

    As for Andre's exceedingly ignorant post, there hasn't been a Republican-controlled Senate in California since 1974, a Republican-controlled Assembly since 1996, or a Republican Governor since 2011. The Senate, the Assembly, and the governorship have all been in Democratic hands for a decade.

    From 2011 to 2019, there was an ever-increasing exodus from the state. Last year, due to COVID-19, the exit rate dropped 5.4%. Even I was supposed to have moved last year, but my contract in South Carolina was cancelled because of COVID-19.

  • 2 months ago

    Hollywood, conceived out of a desire to escape Edison's intellectual property claims, quickly became the primary home of the film industry.

  • 2 months ago

    No, Republicans did.

  • 2 months ago

    I don't think so!!!

  • Foofa
    Lv 7
    2 months ago

    No. California destroyed Hollywood. 

  • 2 months ago

    No, Hollywood did not destroy California.

  • hihi!
    Lv 7
    2 months ago

    GREED is the only destructive force in America... perhaps in the world.

  • Anonymous
    2 months ago

    Yes. We should move it to Montana.
