I’ll try to write this from a fairly neutral perspective, though given what I’ve seen so far on my travels and my personal feelings on the subject, that may prove slightly difficult.
I’ve been really fortunate so far in life: I’ve travelled a hell of a lot in a fairly short space of time, and as a result I’ve been blessed with seeing some of the most beautiful places on the planet. But in visiting places like these, I’ve also noticed, more and more, how heavily the West has influenced these distant countries.
In a lot of instances that’s not a bad thing at all; it’s brought things like mass production, employment and so on. But at the same time it has, I believe, begun to damage a lot of traditions. I know this sounds fairly antiquated, as if I’m saying all countries should stay as they were and never evolve. But that’s not what I mean! What I’m saying is that the West has affected the East in ways it had no place to. I spent some time in Singapore on my trip over, only to notice that all the bars were very “British”; even the plug sockets in the walls were the typical three-pin UK type. Is that not a little strange?
I’ve always been of the opinion that when you’re abroad, you shouldn’t have your typical meal for dinner. What’s the point in travelling thousands of miles simply to sit in a McDonald’s, or to eat fish and chips? If you’ve made all that effort to escape the UK (or the US, I suppose), why not go the extra mile and try a dish you’ve never eaten before? That’s just one example, but it tends to be my theory of travelling. And that theory seems less and less workable if every country I visit is becoming more and more like the country I was trying to escape in the first place.
My rant is most certainly not over; I just wanted to voice my opinion!