Has Hollywood Been Humbled?
Hollywood journos are admitting that Hollywood LOST “the culture war” … and they’re very, very salty about it.
But of course, they don’t blame themselves at all. No, no, no. It’s because their peers rolled over for conservatives or something. Here’s more from Clownfish TV.
Has Hollywood been humbled? After fighting President Donald Trump for a decade, have the Tinseltown leftists finally taken it in the shorts? Are they going to mouth off less and try to get along more? Does anyone care any longer what happens to any of them?
Here’s more from Nerdrotic, focusing on how woke Hollywood is finally bending the knee.