The Western media portrays Islam as a religion that enslaves and oppresses women. One example that comes to mind is the practice of female genital mutilation, or female circumcision, which is practiced mainly in Africa by both Muslims and Christians. Another is the treatment of women under the Taliban in Afghanistan, where women were forbidden to work and denied an education and access to medical care. Many of these women had no male relatives to support them, because their relatives were killed during the various wars, and they were left to beg in the street. The Taliban used Islam to justify treating women in this manner, but from my reading of the Quran and my discussions with other Muslims, this appears to be tribal custom, not Islamic teaching. Another example is the treatment of women under the laws of Saudi Arabia, where women are forbidden to drive and cannot get an education, or even life-saving medical treatment, without the permission of a male relative. Is this allowed under Islam, or are these governments simply mingling their pre-Islamic tribal customs with Islam? One would think that if this treatment of women were against Islamic teaching, more Muslims and clerics would speak out against it. Why the silence? If I do decide to embrace Islam, I would like to embrace the faith as it was intended to be, not tribal customs. I don't mean to offend; I would just like to get some clarification.

