How often do I hear or read on blogs, "That's not true [fill in religion of your choice]. That's only culture/tradition."
I know Christians are sometimes slammed for daring to go into other nations to share the good news about Jesus. Critics often charge them with trying to change the culture, and while I admit we should never arrogantly assume our culture is superior to all others, I do believe most cultures could undergo some changes that would make them better. (Yes, I know my "better" may be different from your "better," so it's subjective. Still...)
I was reading the Spotlight article in the latest In Touch magazine, in which Linda Canup wrote about the group Leeland doing their summer tour in Southeast Asia. They wanted to support the people in their fight against poverty -- not just by throwing a bunch of money at the problem, but by helping communities become self-sufficient. Here is one illustration of someone changed not by Americans imposing their "western values" on a non-western country, but perhaps by Someone else.
The story is about the family of a rickshaw driver whom they met.
"In Hindu culture," he explains, "the wife cooks, but she eats last." However, after attending a biblical class on equality, the rickshaw driver went home, and when his wife started to walk away after cooking the meal, he called her back. They both sat down and ate together, equal amounts, as a family. The husband said, "I ate less than what I usually eat, but I've never been more full in my entire life."
So if God changes someone's heart to the extent that a husband becomes kinder and more equitable toward his wife...that, in my opinion, is a good thing. Some cultural practices are worth changing. Especially by God.
Can you think of some cultural practices that need to go? Can you think of cultural practices that were changed, for good or for bad, due to religious influence?