When it comes to the clash of cultures, no phrase rings a more familiar bell than "West vs. East".
Now, we all know that the West, particularly the United States of America, is the world's pioneer and center of pop culture, one that has spread to every corner of the globe. Then along came the East, most notably Japan, which not only adopted the West's model and standards of pop culture but created and shaped its very own. Since then, Japan has become an enduring rival to the Anglosphere in exporting consumer culture: a strong runner-up at worst, and an established equal or sometimes even a superior at best.
With the theatrics out of the way: in the endless comparisons and debates between the West and the East, especially America and Japan in the visual arts, pop culture, and entertainment, it's easy to side with the West, or to simply declare "the USA is way better than [insert country X]," for a variety of reasons, including patriotism. But that isn't always the case.
For instance, from my personal point of view, American entertainment and pop culture, or at least some major parts of it, has been declining, and for many reasons I've lately found myself more fascinated with Japanese pop culture and entertainment than ever. I'm sure that's not the only example, but it's the primary one I can think of when things are stripped down to a country-versus-country comparison.
So the main question of this thread: In what forms of media and entertainment, or in ANYTHING at all, do you think the East is better than the West right now? What are examples of things in which the East (be it Japan or elsewhere in Asia) is more successful and popular than the West (be it the USA or countries in Europe)? Or, conversely, in which categories do you prefer the West over the East?
Place your answers, gentlemen!