Understanding Aisha’s Age: An Interdisciplinary Approach

Abstract

In recent years, few criticisms of Islam have taken the spotlight as much as condemnations of the Prophet’s marriage to Aisha. Muslims are accused of following the example of a man who had inappropriate relations with a 9-year-old girl. As a result, this has led many to doubt their faith and the moral compass it provides. However, this criticism is based on fallacious reasoning. When reviewing the available evidence, we not only find that early marriage was normal in many early societies, it also made moral sense given their circumstances. Throughout human history, populations had to adapt to their physical and social environments while optimizing their ethical judgments accordingly—much as we do today. This paper elucidates the flawed nature of accusations of the Prophet’s alleged immorality as well as how Islam teaches us to adapt the message of the Qur’an to changing circumstances.

Introduction

In 2014 the Pew Research Center estimated that roughly 57,800 minors (i.e., individuals under 18) were legally married in the United States. Of those marriages, 55% were between an underage girl and an adult man.[1] And while these numbers vary across the nation, in some states the rates are much higher. This includes California, which has recently been entangled in a legal drama over whether an age limit for marriage with parental consent should be established. Influential organizations like Planned Parenthood and the ACLU have been hostile to any proposed changes by legislatures and have thus far been successful in removing any amendments that would place restrictions on juveniles being able to marry with parental agreement. In other words, California currently considers child marriage permissible as long as the child’s parents agree.[2] Likewise, France is currently debating whether it should establish an age of consent. The country has had no set legal age up to this point, which has led to a significant number of acquittals for men accused of raping minors as young as 11, and even younger.[3] These cases are odd given the United States’ and France’s apparent support for the Universal Declaration of Human Rights (UDHR) and its subsequent agreements, including the Convention on Consent to Marriage, Minimum Age for Marriage and Registration of Marriages (1964), which stipulates that:

Parties to the present Convention shall take legislative action to specify a minimum age for marriage. No marriage shall be legally entered into by any person under this age, except where a competent authority has granted a dispensation as to age, for serious reasons, in the interest of the intending spouses.[4]

This is especially disconcerting, considering the ways in which children are exploited and abused by these practices in the contemporary period. Young girls are the most vulnerable to the consequences of early marriage, which not only limits their social, educational, and economic opportunities but also exposes them to health risks from early pregnancy, along with psychological and emotional trauma.[5] How can a society opposed to the exploitation of children allow such practices to exist? And what sort of message is being conveyed through the legal support of such a practice? This is especially concerning in an age when child sex trafficking and online child pornography continue to grow. For example, just this year, German law enforcement uncovered an online child pornography ring with a membership of nearly 90,000 users. Only a handful of them have actually been arrested.[6]

Given this reality, it is unsurprising that the well-being and protection of children continues to be one of our greatest concerns, as well as a very sensitive topic. However, while concerns and sensitivities are undoubtedly warranted, they can sometimes lead us to make rash judgments about past communities—judgments outside the realm of established scientific fact and reason. This is no better exemplified than in what might be considered the most popular criticism of Islam today: the marriage of the Prophet Muhammad ﷺ and Aisha.

A Narrow View of Time

It’s impossible these days to look for information on Islam without being bombarded by warnings about the “dangers” of the religion. Whether the topic is how Islam supposedly promotes terrorism or how a minority population seeks world domination by deceiving people through halal meat and curry, faux experts from around the world spare no effort in demonizing a faith that spans 14 centuries and counts around 1.6 billion followers. However, the easiest way to do this is to appeal to the protective instincts of parents everywhere by presenting Islamic sources detailing the age of the Prophet Muhammad’s ﷺ youngest wife on the day of their marriage:

Narrated by Aisha: The Prophet ﷺ married me when I was six years old and consummated our marriage when I was nine years old. Then I remained with him for nine years (i.e., until his death).[7]

This narration has triggered both indignation and doubt about the moral integrity of the Islamic faith. How could an adult man—declared a moral exemplar among his followers—marry a child? Such questions have resulted in people either dismissing Islamic primary sources as inauthentic or condemning Islamic morals altogether as barbaric. Some Muslims have become so traumatized by the moral implications of these traditions that they’ve argued that the hadiths about Aisha’s age are spurious and have offered in their stead convoluted rationalizations that she was far older when she married (i.e., 18 years of age).[8] 

While such reactions seem valid in the context of our 21st-century Western experiences, they make little sense when discussing the circumstances of people who lived more than a millennium ago. It is far easier to condemn 7th-century desert nomads as “barbarians” than it is to recognize that our moral judgments are as much a function of our environment as were the judgments of our ancestors.

Realizing this means recognizing how often we succumb to a fallacious form of reasoning known as presentism—an anachronistic misinterpretation of history based on present-day circumstances that did not exist in the past.[9] This is a very common mistake made by historians and laypersons alike. However, complex issues almost never come with such easy answers, no matter how high our expectations may be. More often than not, historical realities take time and effort to understand. This is especially the case when we allow false ideas to become popular sentiment, forcing us to wade through pre-existing biases. This struggle has come to be referred to as Brandolini’s Law, named after Alberto Brandolini, an Italian computer programmer who coined the now-famous maxim: “The amount of energy needed to refute [nonsense] is an order of magnitude bigger than that needed to produce it.”[10]

That said, moral judgments can still be made about past people and events. Murder is still murder, theft is still theft, and rape is still rape, no matter the time or place. But how we judge situations of murder, theft, and rape depends on the contexts in which they were committed. For instance, it’s one thing to read about how a historical figure killed another person, but it’s another to know that they did so due to dire need or just cause (e.g., self-defense, war, corporal punishment, etc.). And determining those contexts isn’t always easy, especially when they are so dissimilar to our own. In other words, when studying history, things aren’t always as they appear.


Likewise, when we examine the scientific evidence regarding human development, maturity, and marriage in the past, what we find is a context that not only dispels the moral outrage regarding the marriage between the Prophet Muhammad ﷺ and Aisha, but also allows us to appreciate our ancestors for their struggles; for without them we would not be having this discussion today.

Aisha’s Lived Experience

The story of human development has gone through many phases. Empires have risen and fallen, plagues have burned through entire populations, droughts have starved generations, and natural disasters have buried the most advanced metropolises—a testimony to the fragility of human civilization. Yet, despite all of these trials and tribulations, we are still here, struggling and adapting to the ever-changing conditions of our existence. How we were able to get to this point is a long and complex tale spanning millennia, but one of many reasons may be related to the flexibility of our reproductive capabilities. The ways in which our ancestors have defined childhood, maturity, and marriage have been diverse and quite different from contemporary Western definitions.  

Those who hold to the notion that we are morally superior to our ancestors attribute this dissimilarity to historical societies’ ignorance about physical and psychosocial maturity or nefarious intentions to abuse and take advantage of children. However, it is an extraordinary and unsubstantiated accusation that most of our ancestors were unaware of how to care for their own children, were not concerned about their children’s well-being, had ill intentions, or suffered from a worldwide mental disorder (i.e., pedophilia); such an accusation is easily contradicted by scientific and historical evidence. While it may seem impossible to us that a nine-year-old could be capable of anything other than going to school and engaging in play, this is only because we mistakenly assume that children’s circumstances and capabilities have remained static throughout history.

For example, today we expect our children to go through several years of primary and secondary education, and at least four years of university, to provide them with economic and social opportunities. And this is a perfectly rational expectation, given an average global life expectancy of over 70 years[11] and the increasing complexities of the modern world. However, no such conditions existed 1400 years ago. While people in the past sometimes did reach older ages, this was not the norm. Case in point, the average life expectancy for a working-class Roman citizen in late antiquity was roughly 35 to 40 years—if they lived past infancy.[12] Skeletal remains reveal that prior to death, most laborers suffered from chronic arthritis, fractures, displacements, and even bone cancer. This was due to their very poor diets—primarily stale bread, rotted grains, and little protein—and harsh working conditions.[13] And if they didn’t die from their work, they still had to contend with war, disease, and famine.

The female half of society didn’t have it any easier. The average life expectancy of women was between 34.5 and 37.5 years if they managed to live past infancy.[14] Due to high rates of infant mortality, women had to endure 5 to 7 full-term pregnancies just to keep the population stable.[15] Couple this with high maternal mortality during childbirth—due to iron deficiency resulting from a combination of continuous pregnancies and poor diet—and you have an extremely fragile situation. Given these high mortality rates, it made sense to begin procreating as early as possible.[16] In more affluent families, marrying young also guaranteed the maintenance and acquisition of wealth, securing the future of the family inheritance through a kind of business merger.[17] Likewise, political elites took advantage of early marriage to establish alliances between opponents, an expedient alternative to war. This is why the average age of marriage for young girls in ancient Rome was around 14 or 15, with the legal minimum being 12.[18] Even so, the Romans didn’t consider the age of marriage synonymous with the age of consent for sexual relations, which could be as young as seven.[19]
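To see roughly where figures like 5 to 7 births per woman come from, a back-of-envelope replacement calculation may help (the survival figures below are illustrative assumptions for the sake of the sketch, not data drawn from the cited studies). For a population merely to remain stable, each woman must on average leave behind one daughter who herself survives to reproduce. If only about 30–40% of newborns lived to reproductive age, and roughly half of all births were girls, then:

\[
\text{births per woman} \;\approx\; \frac{1}{0.5 \times 0.4} = 5
\qquad\text{to}\qquad
\frac{1}{0.5 \times 0.3} \approx 6.7
\]

which lands squarely in the 5 to 7 range cited above. The sketch ignores adult female mortality during the reproductive years, which would only push the required number of births higher.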

Thus, working-class children who were fortunate enough to survive infancy had only a little over two decades left to establish the next generation, with nearly half of them losing a parent by the age of 15.[20] This was especially the case for young girls, who at the onset of puberty were expected to transition from childhood directly into adulthood. In other words, there were no family vacations, no recesses, no girl scouts, no school field trips, no sweet sixteen, no prom, no graduation, no air-conditioned movie theatres, no gluten-free meals at overstocked supermarkets, no advanced healthcare facilities, no vaccines, no running water, and subsequently far fewer guarantees that one would survive to see the next morning. And if this was the situation for common people in the most advanced civilization at the time, what more could we possibly expect from desert-dwelling Arabs? Although there is little to no data on Arab marriage practices in late antiquity, given a lack of written records,[21] we do have sufficient documentation of other Semitic cultures during this time. For example, historian Amram Tropper notes the realities of Jewish youth—especially females—in late antiquity:

Most men would have married sufficiently late that we would no longer consider them to have been children, yet many women (particularly in Babylonia) married so young that today we would consider them to have been girls, not women. The goal of maximizing fertility in particular must have lowered the age at first marriage and the price of this goal is the early, we might say premature, end of girlhood. For many girls, adolescence was not a time for fun, education, experimentation or professional training, rather it was a time when one was already expected to assume the full responsibilities of a mature woman, as wife and mother.[22]

The rationale behind maximizing fertility was difficult to argue against, considering the likelihood that a young woman would not live long enough to see her first child reach maturity. When looking into history, we tend to forget many of these notable challenges of our ancestors’ lives and take our own advantages for granted. If you knew that you probably wouldn’t live beyond your 30s, that most of your children would die in infancy, and that the only education you would receive would prepare you for one of a handful of jobs consisting of hard labor, wouldn’t your plans for life change dramatically? Of course they would. Not only that, but such circumstances would also force you to make moral decisions that you thought you would never need to make; decisions that, in hindsight, were necessary and morally appropriate. This is precisely why bioarchaeologists like Mary Lewis have warned against anachronistic thinking when discussing the subject of childhood and maturity in the past:

No matter what period we are examining, childhood is more than a biological age, but a series of social and cultural events and experiences that make up a child’s life…The time at which these transitions take place varies from one culture to another, and has a bearing on the level of interaction children have with their environment, their exposure to disease and trauma, and their contribution to the economic status of their family and society. The Western view of childhood, where children do not commit violence and are asexual, has been challenged by studies of children that show them learning to use weapons or being depicted in sexual poses…What is clear is that we cannot simply transpose our view of childhood directly onto the past.[23]

Because presentism is such a pervasive fallacy, even scientists have been prone to the error, often mistaking biological age for psychosocial fitness. In this respect, bioarchaeologists Sian Halcrow and Nancy Tayles have elucidated some of the obstacles facing research on human development in the past. In their investigations, they found that contemporary Western anachronisms often obstruct more objective analyses of the data:

Much of the tension in the investigation of age in the past arises from the assumption that we can link “biological” to “social” age…distinctions between the categories, particularly “child” cf. “adult,” are the product of the current limitations of osteological methods for age estimation in adults, and that using biological developmental standards for ageing results in the construction of artificial divisions of social and mental development between these categories…Also, in contrast to modern Western society where social age is closely linked to chronological age, in many “traditional” societies, stages of maturation are acknowledged in defining age…These stages take into account not only the chronological age but also the skills, personality and capacities of the individual.[24]

Perhaps the most relevant example of how presentism negatively affects our understanding of the past can be seen in contemporary moral judgments regarding the Prophet’s youngest wife, Aisha (ra). The idea that her marriage was contracted at the age of six and ultimately consummated at nine is seen as an affront by most people. However, when considering the aforementioned evidence, it shouldn’t be so difficult to understand why this practice was perfectly acceptable at the time. Aisha (ra) was merely following in the footsteps of the many girls before her who had reached puberty and were ready to start their adult lives. She herself states that she had reached maturity prior to her marriage:

Narrated Aisha (ra): I had seen my parents following Islam since I attained the age of reason [i.e., puberty]. Not a day passed, but the Prophet ﷺ visited us, both in the mornings and evenings.[25]

What this hadith states is clear if one is aware of the context surrounding it. Aisha (ra) was born in 614 CE and was the daughter of the Prophet’s closest companion, Abu Bakr as-Siddiq—a wealthy merchant who was among the first Muslims and who would eventually become the first caliph. Thus, she lived a rather privileged life in comparison to other children around her. However, in 622 CE, after suffering years of religious persecution at the hands of the pagans in Mecca, she and her family decided to migrate to a safe haven in the neighboring city of Medina. Upon their arrival, Aisha’s (ra) parents set up a temporary residence, where she eventually came down with a fever (possibly due to being weakened by the long and arduous journey).[26] It was around this same time that the Prophet ﷺ was visiting them “both in the mornings and evenings,” and when she began to notice her parents’ outward expression of faith. Shortly thereafter, Aisha (ra) would consummate her marriage with the Prophet ﷺ and move into his household, completing the marriage contract as a full-fledged woman.[27]

The fact that she was nine years of age when she reached puberty should not be surprising, especially given recent studies that have found that the onset of puberty has fluctuated dramatically throughout history. Case in point, while it would have been normal for a young girl to start puberty at around 14 years of age during the Western Industrial Revolution (18th–19th C.), in the 21st century some girls start puberty as early as six.[28] The reasons for these fluctuations are still largely undetermined, although they have been connected to variances in genetics, nutrition, stress, and even the over-sexualization of Western societies.[29] 

However, one may rightfully retort that just because a young girl has begun the process of physically maturing, it does not follow that she possesses an adult mentality; to suggest otherwise would be considered absurd by contemporary standards. And that’s a very appropriate conclusion to come to, considering that even by today’s standards we don’t necessarily regard legally acknowledged adults as independent and functioning members of society; they still need time to learn and experience the world before being considered cognitively and emotionally mature. There’s a reason that 18-year-olds still largely rely on their parents for economic support, despite the law defining them as ‘mature.’

That said, our ancestors faced very different circumstances to which they had to adapt—circumstances that determined their physical and psychosocial fitness. In this regard, endocrinologists Peter Gluckman and Mark Hanson have emphatically stated that the mismatch between biological and psychosocial maturation is a relatively recent phenomenon:

For the first time in our evolutionary history, biological puberty in females significantly precedes, rather than being matched to, the age of successful functioning as an adult. This mismatch between the age of biological and psychosocial maturation constitutes a fundamental issue for modern society. Our social structures have been developed in the expectation of longer childhood, prolonged education and training, and later reproductive competence. This emerging mismatch creates fundamental pressures on contemporary adolescents and on how they live in society.[30]

So while it is certainly true that the onset of puberty does not make someone an adult—in the current context—this same judgment does not apply to people of the past. By indulging in presentism, we disregard the facts of how our ancestors were forced to live just to survive. Furthermore, we open ourselves to intellectual embarrassment by misinterpreting history.

The most obvious manifestation of this fallacy can be seen when examining contemporary interpretations of some notable ahadith on the life of Aisha (ra). For example, many anti-Islam websites love to quote the following narration when arguing that Aisha (ra) was not mature enough to be married:

Narrated Aisha (ra): I used to play with dolls in the presence of the Prophet ﷺ, and my girlfriends also used to play with me. When Allah’s Apostle ﷺ used to enter (my dwelling place) they used to hide themselves, but the Prophet ﷺ would call them to join and play with me. [The playing with the dolls and similar images is forbidden, but it was allowed for Aisha (ra) at that time, as she was a little girl, not yet reached the age of puberty].[31] 

Many people assume that since Aisha (ra) was playing with dolls, she must have still been a child at the time of this narration. Prior to addressing the implication that playing with dolls equates to lacking maturity, what is immediately noticeable about this hadith is the statement in brackets (i.e., “…a little girl, not yet reached the age of puberty”). However, there is a glaring problem with the way this hadith is presented. For those who take this as a clear affirmation that she was a child, the fact of the matter is that the bracketed statement is nowhere to be found in the hadith itself; rather, it is an addition from a hadith commentary called Fath al-Bari fi Sharh Sahih Bukhari, authored by the famous hadith scholar Ibn Hajar al-Asqalani (d. 1449 CE). This is important to note because the translation does not mark it as commentary. The fact that some translators of the hadith have decided to include it is also telling. For what reason did they insert this commentary into the hadith? And why would Ibn Hajar claim that Aisha (ra) hadn’t reached puberty? To answer these questions, we need only refer to Ibn Hajar himself:

I [Ibn Hajar] say: To say with certainty, [that she was not yet at the age of puberty] is questionable, though it might possibly be so. This, because A’isha (ra) was a 14-year-old girl at the time of the Battle of Khaybar—either exactly 14 years old, or having just passed her 14th year, or approaching it. As for her age at the time of the Battle of Tabook, she had by then definitely reached the age of puberty. Therefore, the strongest view is that of those who said: “It was in Khaybar” [i.e., when she was not yet at the age of puberty], and made reconciliation [between the apparent contradictory rulings of the permissibility of dolls in particular and the prohibition of images in general]…[32]

This explanation by Ibn Hajar reveals a number of important points which run contrary to the initial impressions of the hadith. The first and most obvious issue with Ibn Hajar’s commentary is that he admits that Aisha (ra) was at least 14 years of age at the time this narration takes place, putting her well above the average age of the onset of puberty in the Near East during late antiquity (and even by today’s standards). This is most likely why Ibn Hajar felt his own conclusion was questionable. Despite his own doubts, however, he suggests she must not have reached puberty for reasons completely unrelated to her actual biological or psychosocial maturity: it helped him to reconcile an apparent contradiction between her behavior and the legal prohibition of adults playing with dolls. However, what makes Ibn Hajar’s opinion even more tenuous is that it is at odds with the position of earlier master scholars of hadith and Islamic jurisprudence, such as Imam al-Bayhaqi (d. 1066), who held that the prohibition was only declared after the events narrated in the hadith in question.[33] That aside, it was not uncommon for young women in the past to own and even play with dolls, as these objects would be among the very few possessions they had prior to marriage. Commenting on the interpretation of toys and similar objects from past societies and cultures, anthropologist Laurie Wilkie notes:

Highly valued toys and childhood objects can be curated well into adulthood and passed on to subsequent generations of children; therefore, artefacts found in the archaeological record may not adequately reflect the full range of material culture used and cherished by the users.[34]

However, many of these realities escape the mindset affected by presentism, leaving one in the position of making inappropriate moral judgments about our ancestors and their lived experiences. The fact that just a cursory analysis of the aforementioned narration so easily exposes the erroneous assumptions about Aisha’s (ra) lack of maturity should be evidence enough of the fallaciousness of this form of reasoning. That said, even if one were to grant the complexities of childhood and development over time, these realities might appear to imply moral relativism—the idea that moral principles are only valid within their specific time, place, or culture. However, this couldn’t be further from the truth.

An Exemplar in a Changing World

Not only has our perspective on history been skewed by the fallacy of presentism, but so has our understanding of morality. Today, many people seem to think that morality is absolute and that this implies that the circumstances in which moral decisions are made have remained static. However, this is false. But to claim the opposite extreme—that morality is relative—is also false. As in all complex problems, black-and-white conclusions tend to miss the mark. The reality is that one can validly hold unchanging moral principles while still believing in historically contingent moral dilemmas. In other words, there can be, and are, correct and incorrect choices for every conceivable moral issue, regardless of varying circumstances.

For example, when considering an immoral act like murder, or taking the life of a person unjustly, what constitutes murder depends entirely on the circumstances in which the killing took place. Was the person killed accidentally? Was it an act of self-defense? Or was it because of malicious intent? These are general questions that can be answered and judged in the same manner, regardless of time or place. However, the details are what make things interesting.

Imagine that you’ve been chosen to serve on a jury for a murder trial. Both the prosecution and the defense present their evidence, eyewitness testimonies, potential motives, criminal histories, etc. However, after hours of testimony, you’re still unsure. Then the prosecution presents a forensic DNA analysis that conclusively shows not only that the accused was at the scene of the murder (contrary to his alibi) but that the blood of the victim was found on his clothing. Guilty as charged.

Now, let’s take a similar murder trial, but from 1984—prior to the development of DNA profiling. In this instance, would it be morally unjustified for you or anyone else to declare the accused ‘guilty’ without the use of such forensic evidence? Would it be reasonable to condemn the jurors, despite them not having access to such technology? According to those enchanted by presentism, every murder trial prior to 1984 must have been immoral, despite people doing their best to safeguard society and implement justice with the options they had available.

A perhaps more relevant example can be found in contemporary age-of-consent laws across the world. Anyone younger than a legally stipulated minimum age is generally regarded as too incompetent or too vulnerable to consent to sexual or emotional relationships. Subsequently, adults who engage in sexual relations with minors are declared to be pedophiles or child molesters. However, if we recall the aforementioned evidence showcasing the vast differences in development and maturity over time, it would be utterly illogical to apply today’s parameters of legal consent to past societies. Not only were our ancestors more prepared to consent to such relationships at younger ages, but their circumstances also limited whom they could conceivably consent to; shorter lifespans and harsher environments didn’t give people many options—once one reached puberty, it was time to be an adult. In other words, our ancestors’ views on what constituted maturity were not tied to chronological age, but to other signs of development and competence.

To make this point more persuasive, we need only attempt another thought experiment. Let us imagine that we have a time machine (as in the film Back to the Future). With an understanding of morality firmly rooted in presentism, you assume that all you need to do is apply contemporary laws to the past so as to solve all our ancestors’ problems and improve the future. With this righteous intention in mind, you get into your DeLorean and go back 1400 years to the Arabian Peninsula. After you arrive, you manage to convince the natives of your moral superiority as they marvel at your powers to traverse time and space. As a result, these simple desert dwellers make you their leader and adopt your laws, patiently waiting until the age of 18 to be considered adults (to work, use transport, marry, raise a family, go to war, and take on other major responsibilities). All starts off well in your newly formed utopia of heightened moral consciousness. However, as the years go by, you notice that your newly enlightened population has begun to dwindle at an extremely fast pace. Puzzled by this, you investigate.

What you find is startling: not only has the average age at death remained unchanged, but so have all the other trappings of late antiquity. Contrary to the natives’ former laws and customs—when puberty was the mark of adulthood—you now have middle-aged “children” doing nothing but consuming the hard-earned resources of their elders and giving nothing back to society. Not only that, but you’ve forced these youth into a situation where they now have an average of only 17 years remaining to get married and raise families—most inevitably dying before their own children have reached legal majority.

This subsequently leads to a disproportionate ratio of minors to adults, leaving future generations in the hands of individuals legally incapable of performing basic societal tasks. In summary, the ultimate outcome of your social experiment would be a civilization paralyzed by its own laws and a population bound to become extinct through natural causes or a hostile takeover from neighboring tribes who had the sense to conscript their male members at earlier ages.

You may realize at this point that the judicial and cultural structures of the past weren’t necessarily the problem, but rather the conditions in which those customs manifested themselves. However, it’s too late—your claim to moral superiority has destroyed a once-flourishing society and the entire course of history has been altered as a result. Future generations have ceased to exist and you may have now even put your own existence in jeopardy.

Thankfully, you’re still alive and this is just a hypothetical scenario born from awesome 1980s pop science fiction. But it helps to illustrate that historical laws and customs were not always necessarily on the wrong side of the moral spectrum. What we need to understand is that many moral choices and customs of the past were merely a function of the circumstances people faced. Therefore, it is not fair to consider ourselves morally superior to our ancestors when we aren’t forced to make the decisions they had to make. Likewise, it wouldn’t be fair if our descendants judged us in the same light without regard for our own circumstances. In summary, presentism ultimately negates the past and undermines any and all reasonable moral judgments.

However, Islam neither negates the past nor undermines moral judgment, because intrinsic to the faith are concepts that simultaneously uphold absolute moral principles and accommodate historically contingent circumstances. The first and most important of these is the idea that the Prophet Muhammad ﷺ is a perfect moral exemplar (uswatun hasana) for all times, places, and cultures. In other words, every statement the Prophet ﷺ made and every action he performed is considered to have been the most appropriate response to the dilemmas he faced during his time, and a standard from which we can learn and which we can apply to analogous situations in the future. This theological view not only implies that no one could have behaved better nor ever will, but also that there is an absolute moral standard that can be understood and followed, regardless of historically contingent circumstances. This is exemplified no less in Islamic jurisprudence itself (fiqh): a sophisticated legal tradition with a flexible methodology that adapts to changing circumstances.

Divine Law, Marriage, and Maturity

During the reign of the second caliph of Islam, Umar ibn al-Khattab (ra), the punishment for theft was suspended in response to a catastrophic famine that claimed many lives. Realizing that his subjects were starving and needed to steal food in order to survive, Umar (ra) suspended the punishment for the sake of the survival of his people, an exemplary act of justice.[35] However, his decision was not arbitrary and came from principles inherent in Islamic law itself: istihsan (juristic preference) and maslahah mursalah (public interest).[36] While not all potential moral dilemmas are addressed in Islamic primary sources (the Qur’an and Sunnah), these principles are alluded to and allow for a considerable amount of independent reasoning when a moral issue is ambiguous (mujmal) or can only be ascertained within specific contexts.[37]

Although conventional wisdom assumes a Divine Law must be archaic and incapable of adapting to changing circumstances, Islam promotes a very different perspective: if certain moral dilemmas are contingent on historical circumstance, then the Creator of all existence would naturally formulate a moral code suitable to that reality. To suggest otherwise would be to limit God to one particular time, place, and culture—something clearly uncharacteristic of an Omniscient, Omnipotent, Transcendent Being. Thus, a concise definition of how Islam views law is ‘a system with unchanging moral principles, but flexible application.’ To see how this is possible, we need only examine how Muslim jurists derived rulings pertaining to marriage from the Qur’an and Sunnah, particularly the Prophet’s ﷺ relationship with Aisha (ra).

Starting with the primary source of Islamic jurisprudence, the Qur’an clearly sets a standard of marriageable age, excluding anyone who has not yet reached it:

Test orphans until they reach marriageable age; then, if you find they have sound judgment, hand over their property to them. Do not consume it hastily before they come of age: if the guardian is well off he should abstain from the orphan’s property, and if he is poor he should use only what is fair. When you give them their property, call witnesses in; but God takes full account of everything you do. (Al-Qur’an, 4:6)

In other words, the Qur’an does set an age threshold for marriage. But what exactly is that threshold? The text remains ambiguous with regard to a specific number, but Muslim scholars, particularly in the field of Qur’anic exegesis (tafsir), already understood what was implied. For example, when we examine the commentary of the 14th-century Syrian exegete and jurist Ibn Kathir (d. 1373), we find that he elaborated on the consensus that ‘marriageable age’ refers not to a specific number but to a physical development—the age of puberty.[38] That said, there are still more nuances at play here with regard to marriage and maturity. Firstly, Islamic jurists identified two types of marriage: a contractual marriage and a consummated marriage. The former could be legally entered at any point in a person’s life and later revoked through one’s own volition, regardless of whether one had attained legal maturity.[39] However, such a marriage prohibited any intimate contact between the betrothed and would be comparable today to an engagement.[40] The latter form of marriage (or ‘full marriage’), however, required both parties to be physically capable of sexual relations, given the necessary implication that such a union would lead to this outcome.

Secondly, jurists also differentiated between two types of maturity: the age of majority and the age of physical maturity (i.e., balaghat). Although these two notions may appear similar and redundant in light of the former being marked by the onset of puberty (i.e., menarche or pubertal hair growth), jurists generally viewed physical signs of adulthood as just that—signs, not de facto evidence of reproductive functionality. In other words, while legal majority often coincided with the permissibility to engage in sexual relations, it was not always or necessarily the case. Even feminist critics of Islamic Law, such as Professor Judith Tucker, have recognized this nuance:

A marriage could be contracted before either party was ready for sexual intercourse, but a marriage could not be consummated until both bride and groom were physically mature. Such maturity was not equated with puberty (the marker of legal majority), but rather could be reached before its onset. For a girl, readiness for sexual intercourse was signaled in large part by her appearance, by whether or not she had become an “object of desire,” “fleshy” (samīna), or “buxom” (ḍakhma), physical attributes that signified that she could now “endure intercourse.” Until such time, the marriage, although legally contracted, clearly lacked an essential element.[41]

When determining the physical maturity of an individual, jurists often relied on physical features, the most common criterion being whether the person in question actually looked like an adult. Many jurists even went so far as to declare an average age by which such physical maturity should be reached (i.e., 15-17). In other words, what determined maturity depended entirely on a society’s normative judgments of sexual attractiveness and functionality.[42] However, such nuance has been lost on Islamophobes, who, in their utter desperation to impugn Islam and its followers, interpret certain passages of the Qur’an as condoning pedophilia or child abuse. For example, many critics reference the following verse to bolster their accusations:

If you are in doubt, the period of waiting will be three months for those women who have ceased menstruation and for those who have not [yet] menstruated; the waiting period of those who are pregnant will be until they deliver their burden: God makes things easy for those who are mindful of Him. (Al-Qur’an, 65:4)

Critics infer from the above that, because there is a waiting period for girls who “have not yet menstruated,” it must be permissible to engage in sexual relations with such girls.[43] However, this is an invalid conclusion because it neglects the different types of marriage and maturity in Islamic law. Case in point, the fact that a girl had not yet reached menarche was only evidence that she had yet to manifest the usual signs pertaining to legal majority—not that she was physically immature. A girl could technically still be considered mature based on other indicators, such as her biological age. With regard to this particular possibility, the leading Central Asian 12th-century jurist Ali ibn Abu Bakr al-Marghinani (d. 1197) provided the legal context behind the above verse:

And similarly those who have attained puberty (balaghat) by age, but have not menstruated, based on the end of the verse [“And those who have not menstruated” (65:4)], meaning those who have reached puberty by age, but not by menstruation; [those who have attained puberty] by reaching the age of 15 years according to the opinion of both (Abu Yusuf and Muhammad ibn Hasan al-Shaybani) or 17 years according to the opinion of Abu Hanifah and Malik, but have not yet menstruated; when they divorce they observe a waiting period based on months as well.[44]

It should be clear at this point that had Islam allowed for the sexual exploitation of children, many of these nuances would not exist. Case in point, the Qur’an would never have provided clarifications on the types of women who have waiting periods, or even mentioned a ‘marriageable age’ to begin with, if it allowed any woman, regardless of maturity, to engage in sexual relations. And had jurists permitted such acts, they too would never have bothered to distinguish between girls who were physically mature and those who were not. More importantly, however, had the Prophet ﷺ himself been perceived as promoting the exploitation of children, then said scholars would have simply considered the age of nine to be the only condition necessary for a young girl to be deemed mature. However, the age of nine has never been mentioned as one of the conditions by which to judge maturity in the Islamic tradition. Rather, jurists derived a completely different understanding from the relationship between the Prophet ﷺ and Aisha (ra): that he had entered a contracted marriage with Aisha (ra) when she was six years of age, and then consummated the marriage after she had reached maturity three years later. Simple logical deduction led scholars to conclude that if Islam allowed for the abuse of children, then the Prophet ﷺ would not have needed to wait three full years before finalizing his marriage—but he did wait. He waited because he knew that to do otherwise would have caused harm to his wife, and one of the principal objectives (maqasid) of Islamic law is “the prohibition of subjecting oneself to harm (darar) or causing harm to others (dirar).”[45]

A cursory review of Islamic history shows that this principle has generally been applied when deciding a number of complex legal issues, especially with regard to marriage. One case during the era of the Mamluk Sultanate of Egypt (1250–1517 CE) is particularly noteworthy. In the year 1470 CE, a woman petitioned the grand qadi (judge) of Cairo to have her 12-year-old niece married off due to financial difficulties, as the young girl had no means of support after her parents had abandoned her three years prior. The grand qadi then delegated the case to his deputy, Ibn al-Ṣayrafī, who narrated the incident in his journal. After assessing the situation, al-Ṣayrafī had the girl married off to a soldier’s servant, hoping that it would resolve her precarious circumstances. However, given that she had not yet reached puberty, he made sure to include a clause in the contract prohibiting her husband from consummating the marriage until she had adequately matured. Unfortunately, her husband violated the agreement and the couple was subsequently divorced. The girl’s aunt then complained to the chief dawadar (an assistant to the sultan), Yashbak min Mahdī. Al-Ṣayrafī was eventually called forth by the dawadar to explain why he had allowed such a young girl to be married. His answer was simple and to the point: “Because the Prophet ﷺ married Aisha (ra) when she was nine years old.” However, the dawadar was not satisfied with his response and a few days later ordered the ex-husband to be flogged 100 times and publicly humiliated as “an example to anyone who deflowers young girls.” Interestingly, al-Ṣayrafī agreed to the punishment on account of the husband’s disregard for the boundaries set in the contract.[46]

What this incident showcases is that not only were the qadis concerned with the well-being of immature girls, but so too were higher government officials; both attempted to minimize any potential harm and punished those who inflicted harm on minors. Therefore, examining such examples (in conjunction with traditional Islamic teachings) offers a sharp contrast to the narrative that Islam supports the exploitation of children.

Conclusion

Due to the complex conditions of the contemporary period, young people not only have the option of waiting before engaging in intimate relationships, but should do so for the sake of minimizing any potential harm to their lives. When examining the marriage between the Prophet ﷺ and Aisha (ra), we not only find an example of this nuance being put into practice, but can also glean some of the Divine Wisdom for humanity—a moral code that anticipates the fluctuations of human development over time. By extension, it should be undeniable now that the Prophet Muhammad ﷺ was perfectly within his moral rights to marry and love Aisha (ra). Unfortunately, some Muslims have become ignorant of their own tradition and have succumbed to interpreting Islamic Law in an uncompromising, ahistorical fashion, much like critics of Islam.

Likewise, Western nations have not helped to set the standard by focusing entirely on superficial age limits as determinants of maturity—all the while considering it socially acceptable for their own minors to engage in sexual relations as long as they are within the same age range. It’s difficult to take the Western ethos seriously when there is such a significant demarcation between what constitutes maturity and the permissibility to have sex. You cannot, on the one hand, condemn the practice of child marriage, but at the same time think your own children are physically and emotionally mature enough to have intimate relationships. It simply doesn’t make any sense. A minor who decides to have sex is still a minor who decides to have sex, regardless of whether they choose an age-similar partner. Western culture sends mixed messages when it tells minors that they have the right to intimacy with those they are attracted to only as long as they refrain from potential partners legally recognized as adults. To think that such an arbitrary distinction would matter to a teenager with raging hormones—or make the act any less detrimental—is an absurdity, because a minor would face the same consequences with their peers as they would with adults (e.g., pregnancy, sexually transmitted diseases, domestic violence, exploitation, etc.). In other words, this is simply an inconsistent standard to follow. Thus, the anachronistic outrage towards the marriage of the Prophet ﷺ and Aisha (ra) appears to be nothing more than a vacuous display of virtue signaling born from an ignorance of science, history, morality, and Islam alike.
