
This publication was supervised by Jonathan A. C. Brown.

Abstract

In recent years, few criticisms of Islam have taken the spotlight as much as condemnations of the Prophet’s marriage to Aisha. Muslims are accused of following the example of a man who had inappropriate relations with a 9-year-old girl. As a result, many have come to doubt their faith and the moral compass it provides. However, this criticism is based on fallacious reasoning. When reviewing the available evidence, we find not only that early marriage was normal in many early societies, but also that it made moral sense given their circumstances. Throughout human history, populations had to adapt to their physical and social environments while optimizing their ethical judgments accordingly—much as we do today. This paper elucidates the flawed nature of accusations of the Prophet’s alleged immorality as well as how Islam teaches us to adapt the message of the Qur’an to changing circumstances.

Introduction

Based on data from 2014, the Pew Research Center estimated that roughly 57,800 minors (i.e., individuals under 18) were legally married in the United States. Of those marriages, 55% were between an underage girl and an adult man.[1] And while these numbers vary across the nation, in some states the rates are much higher. This includes California, which has recently been entangled in a legal drama over whether an age limit for marriage with parental consent should be established. Influential organizations like Planned Parenthood and the ACLU have been hostile to any proposed changes by legislatures and have thus far been successful in removing any amendments that would place restrictions on juveniles being able to marry with parental agreement. In other words, California currently considers child marriage permissible as long as the child’s parents agree.[2] Likewise, France is currently debating whether or not it should establish an age of consent. The country has had no set legal age up to this point, which has led to a significant number of acquittals for men accused of raping a minor (as young as 11, and in some cases even younger).[3] These cases are odd given the United States’ and France’s apparent support for the Universal Declaration of Human Rights (UDHR) and its subsequent agreements, including the Convention on Consent to Marriage, Minimum Age for Marriage and Registration of Marriages (1964), which stipulates that:

Parties to the present Convention shall take legislative action to specify a minimum age for marriage. No marriage shall be legally entered into by any person under this age, except where a competent authority has granted a dispensation as to age, for serious reasons, in the interest of the intending spouses.[4]

This is especially disconcerting, considering the ways in which children are exploited and abused by these practices in the contemporary period. Young girls are the most vulnerable to the consequences of early marriage, which not only limits their social, educational, and economic opportunities but exposes them to health risks due to early pregnancy along with psychological and emotional trauma.[5] How can a society opposed to the exploitation of children allow such practices to exist? And what sort of message is being conveyed through the legal support of such a practice? These concerns are amplified by the ever-growing phenomena of child sex trafficking and child pornography on the internet. For example, just this year, German law enforcement uncovered an online child pornography ring with a membership of nearly 90,000 users. Only a handful of them have actually been arrested.[6]
Given this reality, it is unsurprising that the well-being and protection of children continue to be among our greatest concerns, as well as a very sensitive topic. However, while concerns and sensitivities are undoubtedly warranted, they can sometimes lead us to make rash judgments about past communities—judgments outside the realm of established scientific fact and reason. This is nowhere better exemplified than in what might be considered the most popular criticism of Islam today: the marriage of the Prophet Muhammad ﷺ and Aisha.

A Narrow View of Time

It’s impossible these days to look for information on Islam without being bombarded by warnings about the “dangers” of the religion. Whether the topic is how Islam supposedly promotes terrorism or how a minority population seeks world domination by deceiving people through halal meat and curry, faux experts from around the world spare no effort in demonizing a faith spanning 14 centuries and encompassing around 1.6 billion followers. Perhaps the easiest way to do this is to appeal to the protective instincts of parents everywhere by presenting Islamic sources detailing the age of the Prophet Muhammad’s ﷺ youngest wife on the day of their marriage:

Narrated by Aisha: The Prophet ﷺ married me when I was six years old and consummated our marriage when I was nine years old. Then I remained with him for nine years (i.e., until his death).[7]

This narration has triggered both indignation and doubt about the moral integrity of the Islamic faith. How could an adult man—declared a moral exemplar among his followers—marry a child? Such questions have resulted in people either dismissing Islamic primary sources as inauthentic or condemning Islamic morals altogether as barbaric. Some Muslims have become so traumatized by the moral implications of these traditions that they’ve argued that the hadiths about Aisha’s age are spurious and have offered in their stead convoluted rationalizations that she was far older when she married (i.e., 18 years of age).[8] 
While such reactions seem valid in the context of our 21st-century, Western experiences, they make little sense when discussing the circumstances of people who lived more than a millennium ago. It is far easier to condemn 7th-century desert nomads as “barbarians” than for us to comprehend that our moral judgments are as much a function of our environment as the judgments of our ancestors.
Realizing this means recognizing how often we succumb to a fallacious form of reasoning known as presentism—an anachronistic misinterpretation of history based on present-day circumstances that did not exist in the past.[9] This is a very common mistake made by historians and laypersons alike. However, complex issues almost never come with such easy answers, no matter how high our expectations may be. More often than not, historical realities take time and effort to understand. This is especially the case when we allow false ideas to become popular sentiment, forcing us to wade through pre-existing biases. This struggle has come to be referred to as Brandolini’s Law, named after Alberto Brandolini, an Italian computer programmer who coined the now-famous maxim: “The amount of energy needed to refute [nonsense] is an order of magnitude bigger than that needed to produce it.”[10]
That said, moral judgments can still be made about past people and events. Murder is still murder, theft is still theft, and rape is still rape, no matter the time or place. But how we judge situations of murder, theft, and rape depends on the contexts in which they were committed. For instance, it’s one thing to read about how a historical figure killed another person, but it’s another to know that they did so due to dire need or just cause (e.g., self-defense, war, corporal punishment, etc.). And determining those contexts isn’t always easy, especially when they are so dissimilar to our own. In other words, when studying history, things aren’t always as they appear.
Likewise, when we examine the scientific evidence regarding human development, maturity, and marriage in the past, what we find is a context that not only dispels the moral outrage regarding the marriage between the Prophet Muhammad ﷺ and Aisha, but also allows us to appreciate our ancestors for their struggles; for without them we would not be having this discussion today.

Aisha’s Lived Experience

The story of human development has gone through many phases. Empires have risen and fallen, plagues have burned through entire populations, droughts have starved generations, and natural disasters have buried the most advanced metropolises—a testimony to the fragility of human civilization. Yet, despite all of these trials and tribulations, we are still here, struggling and adapting to the ever-changing conditions of our existence. How we were able to get to this point is a long and complex tale spanning millennia, but one of many reasons may be related to the flexibility of our reproductive capabilities. The ways in which our ancestors have defined childhood, maturity, and marriage have been diverse and quite different from contemporary Western definitions.  
Those who hold to the notion that we are morally superior to our ancestors attribute this dissimilarity to historical societies’ ignorance about physical and psychosocial maturity or nefarious intentions to abuse and take advantage of children. However, the accusation that most of our ancestors were unaware of how to care for their own children, were unconcerned about their children’s well-being, harbored ill intentions, or suffered from a worldwide mental disorder (i.e., pedophilia) is extraordinary and unsubstantiated; it is easily contradicted by scientific and historical evidence. While it may seem impossible to us that a nine-year-old could be capable of anything other than going to school and engaging in play, this is only because we mistakenly assume that children’s circumstances and capabilities have remained static throughout history.
For example, today we expect our children to go through several years of primary and secondary education, and at least four years of university, to provide them with economic and social opportunities. And this is a perfectly rational expectation, given an average global life expectancy of over 70 years[11] along with the increasing complexity of the modern, globalized world. However, no such conditions existed 1400 years ago. While people in the past sometimes did reach older ages, this was not the norm. Case in point, the average life expectancy for a working-class Roman citizen in late antiquity was roughly 35 to 40 years—if they lived past infancy.[12] Skeletal remains reveal that prior to death, most laborers suffered from chronic arthritis, fractures, displacements, and even bone cancer. This was due to their very poor diets—primarily stale bread, rotted grains, and little protein—and harsh working conditions.[13] And if they didn’t die from their work, they still had to contend with war, disease, and famine.
The female half of society didn’t have it any easier. The average life expectancy of women was between 34.5 and 37.5 years if they managed to live past infancy.[14] Due to high rates of infant mortality, women had to endure 5 to 7 full-term pregnancies just to keep the population stable.[15] Couple this with high maternal mortality during childbirth—due to iron deficiency resulting from a combination of continuous pregnancies and poor diet—and you have an extremely fragile situation. Given these high mortality rates, it made sense to begin procreating as early as possible.[16] In more affluent families, marrying young also guaranteed the maintenance and acquisition of wealth, securing the future of the family inheritance through a kind of business merger.[17] Likewise, political elites took advantage of early marriage to establish alliances between opponents, an expedient alternative to war. This is why the average age of marriage for young girls in ancient Rome was around 14 or 15, with the legal minimum being 12.[18] Even so, the Romans didn’t consider the age of marriage synonymous with the age of consent for sexual relations, which could be as young as seven.[19]
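To see how these demographic figures fit together, consider a rough back-of-the-envelope sketch. The survival fraction and the age of adulthood used below are assumed round numbers chosen only to illustrate how the cited estimates could arise; they are not values taken from the sources themselves.

% Illustrative arithmetic only; s (the share of live births surviving to adulthood) and the
% age of adulthood (~14, i.e., the onset of puberty) are assumed round values, not figures from the cited studies.
% Replacement fertility: roughly two children must survive to adulthood to replace both parents.
\[
  \text{required births per woman} \;\approx\; \frac{2}{s}, \qquad
  s \approx 0.3\text{--}0.4 \;\Rightarrow\; \frac{2}{s} \approx 5\text{--}7
\]
% Adult years available, given a conditional life expectancy of roughly 35 years:
\[
  35 - 14 \approx 21 \ \text{years of adult life}
\]

On these assumed numbers, the five to seven pregnancies cited above and the “little over two decades” of adult life mentioned in the next paragraph both fall out of the same simple arithmetic.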
Thus, working-class children who were fortunate enough to survive infancy had only a little over two decades left to establish the next generation, with nearly half of them losing a parent by the age of 15.[20] This was especially the case for young girls, who at the onset of puberty were expected to transition from childhood directly into adulthood. In other words, there were no family vacations, no recesses, no girl scouts, no school field trips, no sweet sixteen, no prom, no graduation, no air-conditioned movie theatres, no gluten-free meals at overstocked supermarkets, no advanced healthcare facilities, no vaccines, no running water, and consequently far fewer guarantees that one would survive to see the next morning. And if this was the situation for common people in the most advanced civilization at the time, what more could we possibly expect from desert-dwelling Arabs? Although there is little to no data on Arab marriage practices in late antiquity, given a lack of written records,[21] we do have sufficient documentation of other Semitic cultures during this time. For example, historian Amram Tropper notes the realities of Jewish youth—especially females—in late antiquity:

Most men would have married sufficiently late that we would no longer consider them to have been children, yet many women (particularly in Babylonia) married so young that today we would consider them to have been girls, not women. The goal of maximizing fertility in particular must have lowered the age at first marriage and the price of this goal is the early, we might say premature, end of girlhood. For many girls, adolescence was not a time for fun, education, experimentation or professional training, rather it was a time when one was already expected to assume the full responsibilities of a mature woman, as wife and mother.[22]

The rationale behind maximizing fertility was difficult to argue against, considering the likelihood that a young woman would not live long enough to see her first child reach maturity. When looking into history, we tend to forget many of these notable challenges of our ancestors’ lives and take our own advantages for granted. If you knew that you probably wouldn’t live beyond your 30s, that most of your children would die in infancy, and that the only education you would receive would be for one of a handful of jobs consisting of hard labor, wouldn’t your plans for life change dramatically? Of course they would. Not only that, but such circumstances would also force you to make moral decisions that you thought you would never need to make; decisions that, in hindsight, were necessary and morally appropriate. This is precisely why bioarchaeologists like Mary Lewis have warned against anachronistic thinking when discussing the subject of childhood and maturity in the past:

No matter what period we are examining, childhood is more than a biological age, but a series of social and cultural events and experiences that make up a child’s life...The time at which these transitions take place varies from one culture to another, and has a bearing on the level of interaction children have with their environment, their exposure to disease and trauma, and their contribution to the economic status of their family and society. The Western view of childhood, where children do not commit violence and are asexual, has been challenged by studies of children that show them learning to use weapons or being depicted in sexual poses...What is clear is that we cannot simply transpose our view of childhood directly onto the past.[23]

Because presentism is such a pervasive fallacy, even scientists themselves have been prone to the error, often mistaking biological age for psychosocial fitness. In this respect, bioarchaeologists Sian Halcrow and Nancy Tayles have elucidated some of the obstacles facing research on human development in the past. In their investigations, they found that contemporary Western anachronisms often obstruct more objective analyses of the data:

Much of the tension in the investigation of age in the past arises from the assumption that we can link “biological” to “social” age…distinctions between the categories, particularly “child” cf. “adult,” are the product of the current limitations of osteological methods for age estimation in adults, and that using biological developmental standards for ageing results in the construction of artificial divisions of social and mental development between these categories…Also, in contrast to modern Western society where social age is closely linked to chronological age, in many “traditional” societies, stages of maturation are acknowledged in defining age...These stages take into account not only the chronological age but also the skills, personality and capacities of the individual.[24]

Perhaps the most relevant example of how presentism negatively affects our understanding of the past can be seen in contemporary moral judgments regarding the Prophet’s youngest wife, Aisha (ra). The idea that her marriage was contracted at the age of six and ultimately consummated at nine is seen as an affront by most people. However, when considering the aforementioned evidence, it shouldn’t be so difficult to understand why this practice was perfectly acceptable at the time. Aisha (ra) was merely following in the footsteps of so many girls before her who had reached puberty and were ready to start their adult lives. She herself states that she had reached maturity prior to her marriage:

Narrated Aisha (ra): I had seen my parents following Islam since I attained the age of reason [i.e., puberty]. Not a day passed, but the Prophet ﷺ visited us, both in the mornings and evenings.[25]

What this hadith states is clear if one is aware of the context surrounding it. Aisha (ra) was born in 614 CE and was the daughter of the Prophet’s closest companion, Abu Bakr as-Siddiq—a wealthy merchant who was among the first Muslims and who would eventually become the first caliph. Thus, she lived a rather privileged life in comparison to other children around her. However, in 622 CE, after suffering years of religious persecution at the hands of the pagans in Mecca, she and her family decided to migrate to a safe haven in the neighboring city of Medina. Upon their arrival, Aisha’s (ra) parents set up a temporary residence where she eventually came down with a fever (possibly because she had been weakened by the long and arduous journey).[26] It was around this same time that the Prophet ﷺ was visiting them “both in the mornings and evenings,” and when she began to notice her parents’ outward expression of faith. Shortly thereafter, Aisha (ra) would consummate her marriage with the Prophet ﷺ and move into his household, completing the marriage contract as a full-fledged woman.[27]
The fact that she was nine years of age when she reached puberty should not be surprising, especially given recent studies that have found that the onset of puberty has fluctuated dramatically throughout history. Case in point, while it would have been normal for a young girl to start puberty at around 14 years of age during the Western Industrial Revolution (18th–19th C.), in the 21st century some girls start puberty as early as six.[28] The reasons for these fluctuations are still largely undetermined, although they have been connected to variances in genetics, nutrition, stress, and even the over-sexualization of Western societies.[29] 
However, one may rightfully retort that just because a young girl has begun to mature physically, it does not follow that she possesses an adult mentality; to suggest otherwise would be considered absurd by contemporary standards. And that is an appropriate conclusion to reach, considering that even by today’s standards we don’t necessarily regard legally recognized adults as independent, fully functioning members of society; they still need time to learn and experience the world before being considered cognitively and emotionally mature. There’s a reason that 18-year-olds still largely rely on their parents for economic support, despite the law defining them as ‘mature.’
That said, our ancestors faced very different circumstances to which they had to adapt—circumstances that determined their physical and psychosocial fitness. In this regard, endocrinologists Peter Gluckman and Mark Hanson have emphatically stated that the mismatch between biological and psychosocial maturation is a relatively recent phenomenon:

For the first time in our evolutionary history, biological puberty in females significantly precedes, rather than being matched to, the age of successful functioning as an adult. This mismatch between the age of biological and psychosocial maturation constitutes a fundamental issue for modern society. Our social structures have been developed in the expectation of longer childhood, prolonged education and training, and later reproductive competence. This emerging mismatch creates fundamental pressures on contemporary adolescents and on how they live in society.[30]

So, while it is certainly true that the onset of puberty does not make someone an adult today, this same judgment does not apply to people of the past. By indulging in presentism, we disregard the facts of how our ancestors were forced to live just to survive. Furthermore, we open ourselves to intellectual embarrassment by misinterpreting history.
The most obvious manifestation of this fallacy can be seen when examining contemporary interpretations of some notable hadiths on the life of Aisha (ra). For example, many anti-Islam websites love to quote the following narration when arguing that Aisha (ra) was not mature enough to be married:

Narrated Aisha (ra): I used to play with dolls in the presence of the Prophet ﷺ, and my girlfriends also used to play with me. When Allah's Apostle ﷺ used to enter (my dwelling place) they used to hide themselves, but the Prophet ﷺ would call them to join and play with me. [The playing with the dolls and similar images is forbidden, but it was allowed for Aisha (ra) at that time, as she was a little girl, not yet reached the age of puberty].[31] 

Many people assume that since Aisha (ra) was playing with dolls, she must have still been a child at the time of this narration. Before addressing the implication that playing with dolls equates to a lack of maturity, what is immediately noticeable about this hadith is the statement in brackets (i.e., “…a little girl, not yet reached the age of puberty”). However, there is a glaring problem with the way this hadith is presented. For those who take this as a clear affirmation that she was a child, the fact of the matter is that the last statement is nowhere to be found in the hadith itself; rather, it is an addition from a hadith commentary called Fath al-Bari fi Sharh Sahih Bukhari, authored by the famous hadith scholar Ibn Hajar al-Asqalani (d. 1449 CE). This is important to note because nothing in the translation signals that these words are commentary rather than part of the narration, and the fact that some translators have decided to include them is itself telling. For what reason did they put this commentary in the hadith? And why would Ibn Hajar claim that Aisha (ra) hadn’t reached puberty? To answer these questions, we need only refer to Ibn Hajar himself:

I [Ibn Hajar] say: To say with certainty, [that she was not yet at the age of puberty] is questionable, though it might possibly be so. This, because A’isha (ra) was a 14-year-old girl at the time of the Battle of Khaybar—either exactly 14 years old, or having just passed her 14th year, or approaching it. As for her age at the time of the Battle of Tabook, she had by then definitely reached the age of puberty. Therefore, the strongest view is that of those who said: “It was in Khaybar” [i.e., when she was not yet at the age of puberty], and made reconciliation [between the apparent contradictory rulings of the permissibility of dolls in particular and the prohibition of images in general]...[32]

This explanation by Ibn Hajar reveals a number of important points which run contrary to the initial impressions of the hadith. The first and most obvious issue with Ibn Hajar’s commentary is that he admits that Aisha (ra) was at least 14 years of age at the time this narration takes place, putting her well above the average age of the onset of puberty in the Near East during late antiquity (and even by today’s standards). This is most likely why Ibn Hajar felt his own conclusion was questionable. Despite his own doubts, however, he suggests she must not have reached puberty due to reasons completely unrelated to her actual biological or psychosocial maturity: it helped him to reconcile an apparent contradiction in her behavior with the legal prohibition of adults playing with dolls. However, what makes Ibn Hajar’s opinion even more tenuous is that his view conflicts with that of earlier master scholars of hadith and Islamic jurisprudence, such as Imam al-Bayhaqi (d. 1066), who held that the prohibition was only declared after the events narrated in the hadith in question.[33] That aside, it was not uncommon for young women in the past to own and even play with dolls, as these objects would be among the very few possessions they had prior to marriage. Commenting on the interpretation of toys and similar objects from past societies and cultures, anthropologist Laurie Wilkie notes:

Highly valued toys and childhood objects can be curated well into adulthood and passed on to subsequent generations of children; therefore, artefacts found in the archaeological record may not adequately reflect the full range of material culture used and cherished by the users.[34]

However, many of these realities escape a mindset affected by presentism, leaving one in the position of making inappropriate moral judgments about our ancestors and their lived experiences. The fact that just a cursory analysis of the aforementioned narration so easily exposes the erroneous assumptions about Aisha’s (ra) lack of maturity should be evidence enough of the fallaciousness of this form of reasoning. That said, even if one were to admit the complexities of childhood and development over time, these realities may appear to point toward moral relativism—the idea that moral principles are only valid given their specific time, place, or culture. However, this couldn’t be further from the truth.

An Exemplar in a Changing World

Not only has our perspective on history been skewed by the fallacy of presentism, but so has our understanding of morality. Today, many people seem to think that morality is absolute and that this implies that the circumstances in which moral decisions are made have remained static. This is false; but so is the opposite extreme—the claim that morality is relative. As with all complex problems, black-and-white conclusions tend to miss the mark. The reality is that one can validly hold unchanging moral principles while still recognizing historically contingent moral dilemmas. In other words, there can be, and are, correct and incorrect choices for every conceivable moral issue, regardless of varying circumstances.
For example, when considering an immoral act like murder, or taking the life of a person unjustly, what constitutes murder depends entirely on the circumstances in which the killing took place. Was the person killed accidentally? Was it an act of self-defense? Or was it because of malicious intent? These are general questions that can be answered and judged in the same manner, regardless of time or place. However, the details are what make things interesting.
Now, consider a murder trial from 1984, prior to the development of DNA profiling. In this instance, would it be morally unjustified for you or anyone else to declare the accused ‘guilty’ without the use of forensic evidence? Would it be reasonable to condemn the jurors, despite them not having access to such technology? According to those enchanted by presentism, every murder trial prior to 1984 must be deemed immoral, despite people doing their best to safeguard society and implement justice with the options they had available.
A perhaps more relevant example can be found in contemporary age-of-consent laws across the world. Anyone younger than a legally stipulated minimum age is generally regarded as too incompetent or too vulnerable to consent to sexual or emotional relationships. Consequently, adults who engage in sexual relations with minors are declared to be pedophiles or child molesters. However, if we recall the aforementioned evidence showcasing the vast differences in development and maturity over time, it would be utterly illogical to apply the parameters of legal consent today to past societies. Not only were our ancestors more prepared to consent to such relationships at younger ages, but their circumstances limited whom they could conceivably consent to; shorter lifespans and harsher environments didn’t give people many options—once one reached puberty, it was time to be an adult. In other words, our ancestors’ views on what constituted maturity were not tied to chronological age, but to other signs of development and competence.
To make this point more persuasive, we need only attempt another thought experiment. Let us imagine that we have a time machine (as in the film Back to the Future). With an understanding of morality firmly rooted in presentism, you assume that all you need to do is apply contemporary laws to the past so as to solve all our ancestors’ problems and improve the future. With this righteous intention in mind, you get into your DeLorean and go back 1400 years to the Arabian Peninsula. After you arrive, you manage to convince the natives of your moral superiority as they marvel at your powers to traverse time and space. As a result, these simple desert dwellers make you their leader and adopt your laws, patiently waiting until the age of 18 to be considered adults (to work, use transport, marry, raise a family, go to war, and take on other major responsibilities). All starts off well in your newly formed utopia of heightened moral consciousness. However, as the years go by, you notice that your newly enlightened population has begun to dwindle at an extremely fast pace. Puzzled by this, you investigate.
What you find is startling: not only has the average age at death remained unchanged, but so have all the other trappings of late antiquity. Contrary to the natives’ former laws and customs—when puberty was the mark of adulthood—you now have middle-aged “children” doing nothing but consuming the hard-earned resources of their elders and giving nothing back to society. Not only that, but you’ve forced these youth into a situation where they now have an average of only 17 years remaining to get married and raise families—most inevitably dying before their own children have reached legal majority.
This subsequently leads to a disproportionate ratio of minors to adults, leaving future generations in the hands of individuals legally incapable of performing basic societal tasks. In summary, the ultimate outcome of your social experiment would be a civilization paralyzed by its own laws and a population bound to become extinct through natural causes or a hostile takeover from neighboring tribes who had the sense to conscript their male members at earlier ages.
You may realize at this point that the judicial and cultural structures of the past weren’t necessarily the problem, but rather the conditions in which those customs manifested themselves. However, it’s too late—your claim to moral superiority has destroyed a once-flourishing society and the entire course of history has been altered as a result. Future generations have ceased to exist and you may have now even put your own existence in jeopardy.
Thankfully, you’re still alive and this is just a hypothetical scenario born from awesome 1980s pop science fiction. But it helps to illustrate that historical laws and customs were not always necessarily on the wrong side of the moral spectrum. What we need to understand is that many moral choices and customs of the past were merely a function of the circumstances people faced. Therefore, it is not fair to consider ourselves morally superior to our ancestors when we aren’t forced to make the decisions they had to make. Likewise, it wouldn’t be fair if our descendants judged us in the same light without regard for our own circumstances. In summary, presentism ultimately negates the past and undermines any and all reasonable moral judgments.
However, Islam neither negates the past nor undermines moral judgment, because intrinsic to the faith are concepts which manage to simultaneously support absolute moral principles and historically contingent circumstances. The first and most important of these is the idea that the Prophet Muhammad ﷺ is a perfect moral exemplar (uswatun hasana) for all times, places, and cultures. In other words, every statement the Prophet ﷺ made and every action he performed is considered to have been the most appropriate response to the dilemmas he faced during his time, and a standard from which we can learn and which we can apply to analogous situations in the future. This theological view implies not only that no one could have behaved better or ever will, but also that there is an absolute moral standard that can be understood and followed, regardless of historically contingent circumstances. This is exemplified no less in Islamic jurisprudence itself (fiqh), a sophisticated legal tradition with a flexible methodology that adapts to changing circumstances.

Divine Law, Marriage, and Maturity

During the reign of the second caliph of Islam, Umar ibn al-Khattab (ra), the punishment for theft was suspended in response to a catastrophic famine that claimed many lives. Realizing that his subjects were starving and needed to steal food in order to survive, Umar (ra) waived the punishment for the sake of his people’s survival, an exemplary act of justice.[35] However, his decision was not arbitrary and came from principles inherent in Islamic law itself: istihsan (juristic preference) and maslahah mursalah (public interests).[36] While not all potential moral dilemmas are addressed in Islamic primary sources (the Qur’an and Sunnah), these principles are alluded to and allow for a considerable amount of independent reasoning when a moral issue is ambiguous (mujmal) or can only be ascertained within specific contexts.[37]
Although conventional wisdom assumes a Divine Law must be archaic and incapable of adapting to changing circumstances, Islam promotes a very different perspective: if certain moral dilemmas are contingent on historical circumstance, then the Creator of all existence would naturally formulate a moral code suitable to that reality. To suggest otherwise would be to limit God to one particular time, place, and culture—something clearly uncharacteristic of an Omniscient, Omnipotent, Transcendent Being. Thus, a concise definition of how Islam views law is ‘a system with unchanging moral principles, but flexible application.’ To see how this is possible, we need only examine how Muslim jurists derived rulings pertaining to marriage from the Qur’an and Sunnah, particularly the Prophet’s ﷺ relationship with Aisha (ra).
Starting with the primary source of Islamic jurisprudence, the Qur’an clearly establishes a standard of marriageable age, one that excludes anyone who has not yet reached it:

Test orphans until they reach marriageable age; then, if you find they have sound judgment, hand over their property to them. Do not consume it hastily before they come of age: if the guardian is well off he should abstain from the orphan’s property, and if he is poor he should use only what is fair. When you give them their property, call witnesses in; but God takes full account of everything you do. (Al-Qur’an, 4:6)

In other words, the Qur’an sets an age threshold for marriage. But what exactly is that threshold? The text remains ambiguous with regard to a specific number, but Muslim scholars, particularly in the field of Qur’anic exegesis (tafsir), already understood what was implied. For example, when we examine the commentary of the 14th-century Syrian exegete and jurist Ibn Kathir (d. 1373), we find that he elaborated on the consensus that ‘marriageable age’ refers not to a specific number but to a stage of physical development—the age of puberty.[38] That said, there are still more nuances at play here with regard to marriage and maturity. Firstly, Islamic jurists identified two types of marriage: a contractual marriage and a consummated marriage. The former could be legally entered at any point in a person’s life and later revoked through one’s own volition, regardless of whether legal maturity had been attained.[39] However, such a marriage prohibited any intimate contact between the betrothed and would be comparable today to an engagement.[40] The latter form of marriage (or ‘full marriage’), however, required both parties to be physically capable of sexual relations, given the logical implication that such a union would lead to this outcome.

Secondly, Islamic jurists distinguished between the age of legal majority (bulugh) and the age of physical maturity (balaghat). Although these two notions may appear similar and redundant in light of the former being marked by the onset of puberty (i.e., menarche or pubertal hair growth), jurists generally viewed physical signs of adulthood as just that—signs, not de facto evidence of reproductive functionality. In other words, while legal majority often coincided with the permissibility to engage in sexual relations, this was not always or necessarily the case. Even feminist critics of Islamic Law, such as Professor Judith Tucker, have recognized this nuance:

A marriage could be contracted before either party was ready for sexual intercourse, but a marriage could not be consummated until both bride and groom were physically mature. Such maturity was not equated with puberty (the marker of legal majority), but rather could be reached before its onset. For a girl, readiness for sexual intercourse was signaled in large part by her appearance, by whether or not she had become an “object of desire,” “fleshy” (samina), or “buxom” (dakhma), physical attributes that signified that she could now “endure intercourse.” Until such time, the marriage, although legally contracted, clearly lacked an essential element.[41]

When determining the physical maturity of an individual, jurists often relied on physical features, the most common consideration being whether the person in question actually looked like an adult. Many jurists even went so far as to declare an average age by which such physical maturity should be reached (i.e., 15-17). In other words, what determined maturity depended entirely on a society’s normative judgments of sexual attractiveness and functionality.[42] However, such nuance has been lost on Islamophobes, who, in their utter desperation to impugn Islam and its followers, interpret certain passages of the Qur’an as condoning pedophilia or child abuse. For example, many critics often reference the following verse to bolster their accusations:

If you are in doubt, the period of waiting will be three months for those women who have ceased menstruation and for those who have not [yet] menstruated; the waiting period of those who are pregnant will be until they deliver their burden: God makes things easy for those who are mindful of Him. (Al-Qur’an, 65:4)

Critics infer from the above that the existence of a waiting period for girls who “have not yet menstruated” indicates that it is permissible to engage in sexual relations with prepubescent girls.[43] However, this is an invalid conclusion because it neglects the different types of marriage and maturity in Islamic law. Case in point, the fact that a girl had not yet reached menarche was only evidence that she had yet to manifest the usual signs pertaining to legal majority—not that she was physically immature. A girl could still be considered mature based on other indicators, such as her chronological age. With regard to this particular possibility, the leading Central Asian 12th-century jurist, Ali ibn Abu Bakr al-Marghinani (d. 1197), provided this legal context behind the above verse:

And similarly those who have attained puberty (balaghat) by age, but have not menstruated, based on the end of the verse [“And those who have not menstruated” (65:4)], meaning those who have reached puberty by age, but not by menstruation; [those who have attained puberty] by reaching the age of 15 years according to the opinion of both (Abu Yusuf and Muhammad ibn Hasan al-Shaybani) or 17 years according to the opinion of Abu Hanifah and Malik, but have not yet menstruated; when they divorce they observe a waiting period based on months as well.[44]

It should be clear at this point that had Islam allowed for the sexual exploitation of children, many of these nuances would not exist. Case in point, the Qur’an would never have provided clarifications on the types of women who have waiting periods, or even mentioned a ‘marriageable age’ to begin with, if it allowed for any woman, regardless of maturity, to engage in sexual relations. And had jurists permitted such acts, they too would never have bothered to distinguish between girls who were physically mature and those who were not. More importantly, however, had the Prophet ﷺ himself been perceived as promoting the exploitation of children, then said scholars would have simply considered the age of nine to be the only condition necessary for a young girl to be considered mature. However, the age of nine has never been mentioned as one of the conditions by which to judge maturity in the Islamic tradition. Rather, jurists derived a completely different understanding from the relationship between the Prophet ﷺ and Aisha (ra): that he had entered a contracted marriage with her when she was six years of age, and then consummated the marriage after she had reached maturity three years later. Simple logical deduction led scholars to conclude that if Islam allowed for the abuse of children, then the Prophet ﷺ would not have needed to wait three full years before finalizing his marriage—but he did wait. He waited because he knew that to do otherwise would have caused harm to his wife, and one of the principal objectives (maqasid) of Islamic law is “the prohibition of subjecting oneself to harm (darar) or causing harm to others (dirar).”[45]
A cursory review of Islamic history shows that this principle has generally been applied when deciding a number of complex legal issues, especially with regard to marriage. One case during the era of the Mamluk Sultanate of Egypt (1250–1517 CE) is particularly noteworthy. In the year 1470 CE, a woman petitioned the grand qadi (judge) of Cairo to have her 12-year-old niece married off due to financial difficulties, as the young girl had no means of support after her parents had abandoned her three years prior. The grand qadi then delegated the case to his deputy, Ibn al-Ṣayrafī, who narrated the incident in his journal. After assessing the situation, al-Ṣayrafī had the girl married off to a soldier’s servant, hoping that it would resolve her precarious circumstances. However, given that she had not yet reached puberty, he made sure to include a clause in the contract prohibiting her husband from consummating the marriage until she had adequately matured. Unfortunately, her husband violated the agreement and the couple was subsequently divorced. The girl’s aunt then complained to the chief dawadar (an assistant to the sultan), Yashbak min Mahdī. Al-Ṣayrafī was eventually called forth by min Mahdi to explain why he had allowed such a young girl to be married. His answer was simple and to the point: “Because the Prophet ﷺ married Aisha (ra) when she was nine years old.” However, the dawadar was not satisfied with his response and a few days later ordered the ex-husband to be flogged 100 times and publicly humiliated as “an example to anyone who deflowers young girls.” Interestingly, al-Ṣayrafī agreed to the punishment on account of the husband’s disregard for the boundaries set in the contract.[46] 
What this incident showcases is that not only were the qadis concerned with the well-being of immature girls but so too were higher government officials; both attempted to minimize any potential harm and punished those who inflicted harm on minors. Therefore, examining such examples (in conjunction with traditional Islamic teachings) offers a sharp contrast to the narrative that Islam supports the exploitation of children.

Conclusion

Due to the complex conditions of the contemporary period, young people not only have the option of waiting before engaging in intimate relationships, but should do so for the sake of minimizing any potential harm to their lives. When examining the marriage between the Prophet ﷺ and Aisha (ra), we not only find an example of this nuance being put into practice, but can also glean some of the Divine Wisdom for humanity—a moral code that anticipates the fluctuations of human development over time. By extension, it should be undeniable now that the Prophet Muhammad ﷺ was perfectly within his moral rights to marry and love Aisha (ra). Unfortunately, some Muslims have become ignorant of their own tradition and have succumbed to interpreting Islamic Law in an uncompromising, ahistorical fashion, much like the critics of Islam.
Likewise, Western nations have not helped to set the standard by focusing entirely on superficial age limits as determinants of maturity—all the while considering it socially acceptable for their own minors to engage in sexual relations as long as they are within the same age range. It’s difficult to take the Western ethos seriously when what constitutes maturity is so sharply divorced from the permissibility of having sex. You cannot, on the one hand, condemn the practice of child marriage, but at the same time think your own children are physically and emotionally mature enough to have intimate relationships. It simply doesn’t make any sense. A minor who decides to have sex is still a minor who decides to have sex, regardless of whether they choose an age-similar partner or not. Western culture sends mixed messages when it tells minors that they have the right to intimacy with those they are attracted to only as long as they refrain from potential partners legally recognized as adults. To think that such an arbitrary distinction would matter to a teenager with raging hormones—or be considered detrimental—is an absurdity, because a minor would face the same consequences with their peers as they would with adults (e.g., pregnancy, sexually transmitted diseases, domestic violence, exploitation). In other words, this is simply an inconsistent standard to follow. Thus, the anachronistic outrage towards the marriage of the Prophet ﷺ and Aisha (ra) appears to be nothing more than a vacuous display of virtue signaling born from an ignorance of science, history, morality, and Islam alike.
Much gratitude to Sh. Omar Suleiman and Prof. Jonathan Brown for their guidance on this important topic. Also thanks to Justin Parrott and Dr. Nazir Khan for their feedback and assistance in research, and Dr. Nameera Akhtar for editing this paper. May Allah bless them and all the other members of Yaqeen for making this project possible.  

Notes

[1] David McClendon and Aleksandra Sandstrom, “Child marriage is rare in the U.S., though this varies by state,” Pew Research Center, November 1, 2016, http://www.pewresearch.org/fact-tank/2016/11/01/child-marriage-is-rare-in-the-u-s-though-this-varies-by-state/

[2] Jill Tucker, “Effort to bar child marriage in California runs into opposition,” San Francisco Chronicle, July 7, 2017, https://www.sfchronicle.com/bayarea/article/Effort-to-bar-juvenile-marriages-in-California-11268497.php

[3] Marie Doezema, “France, Where Age of Consent Is Up for Debate,” The Atlantic, March 10, 2018, https://www.theatlantic.com/international/archive/2018/03/frances-existential-crisis-over-sexual-harassment-laws/550700/

[4] “Convention on Consent to Marriage, Minimum Age for Marriage and Registration of Marriages,” United Nations Human Rights Office of the High Commissioner, Accessed November 15, 2017, http://www.ohchr.org/EN/ProfessionalInterest/Pages/MinimumAgeForMarriage.aspx

[5] “Child marriage is a violation of human rights, but is all too common,” UNICEF Data, March 2018, https://data.unicef.org/topic/child-protection/child-marriage/#

[6] “German police uncover Darknet child pornography website with 90,000 users,” Independent, July 6, 2017, https://www.independent.co.uk/news/world/europe/germany-child-pornography-website-90000-users-darknet-frankfurt-a7827146.html

[7] Sahih al-Bukhari, Book 67, #69.

[8] This position is relatively new and has only been proposed in the past few decades. It is a rather tenuous position to take, one which primarily relies on speculations about the age of Aisha according to obscure historical accounts and other hadiths.

[9] David H. Fischer, Historians’ Fallacies: Toward a Logic of Historical Thought (New York: Harper & Row Publishers, 1970), pp. 135-140.

[10] Alberto Brandolini, Twitter post, Jan 10, 2013, 11:29 pm, https://twitter.com/ziobrando/status/289635060758507521

[11] Department of Economic and Social Affairs Population Division, “World Population Prospects: The 2015 Revision, Key Findings and Advance Tables. Working Paper No. ESA/P/WP.241.,” United Nations, 2015, https://esa.un.org/unpd/wpp/publications/files/key_findings_wpp_2015.pdf

[12] Even when we take into account the high rates of infant mortality in ancient Rome, we still find that most adults did not live very long. Unfortunately, studies which purport that the average life expectancy of ancient peoples is similar to ours today often neglect the archaeological evidence from Roman gravestones and burial sites showing that a high proportion died from disease, famine, war, labor, and natural disasters—circumstances which the contemporary world is far better prepared to handle. Likewise, pointing to written records of famous historical figures living well into their 70s and beyond does nothing to support this point. It’s certainly the case that some people lived just as long as most people do today, but they were the exception and not representative of the broader population. Therefore, to suggest that ancient people had similar life expectancies to our own is simply wrong.

[13] Patrick Browne, “Why the average ancient Roman worker was dead by 30,” The Local, May 27, 2016, https://www.thelocal.it/20160527/groundbreaking-study-reveals-brutal-realities-of-life-in-ancient-rome. For the full study, see Andrea Piccioli et al., Bones: Orthopaedic Pathologies in Roman Imperial Age (New York: Springer, 2015).

[14] Kyle Harper, “Marriage and Family” in The Oxford Handbook of Late Antiquity, Ed. Scott Fitzgerald Johnson (New York: Oxford University Press, 2015), p. 685.

[15] Nathan Pilkington, “Growing Up Roman: Infant Mortality and Reproductive Development,” Journal of Interdisciplinary History 44:1 (2013), p. 6.

[16] Daniel Nettle, "Flexibility in reproductive timing in human females: Integrating ultimate and proximate explanations," Philosophical Transactions of the Royal Society B: Biological Sciences, 366:1563 (2011), pp. 357-58.

[17] Lisa A. Alberici and Mary Harlow, “Age and Innocence: Female Transitions to Adulthood in Late Antiquity,” Hesperia Supplements, 41 (2007), p. 195.

[18] M. K. Hopkins, “The Age of Roman Girls at Marriage,” Population Studies, 18:3 (1965), p. 313.

[19] Vern L. Bullough, “Age of Consent,” Journal of Psychology & Human Sexuality, 16:2-3 (2005), pp. 29-30.

[20] John T. Fitzgerald, “Orphans In Mediterranean Antiquity and Early Christianity,” Acta Theologica, Suppl. 23 (2016), p. 33.

[21] Prior to the advent of Islam, Arab culture was based primarily on the oral transmission of information. It wasn’t until the early Islamic conquests and the resulting subsuming of external societies (e.g., Persian and Roman) that Muslims adopted writing as a standard medium of communication.

[22] Amram Tropper, “Children and Childhood in Light of the Demographics of the Jewish Family in Late Antiquity,” Journal for the Study of Judaism in the Persian, Hellenistic, and Roman Period 37:3 (2006), p. 332.

[23] Mary Lewis, The Bioarchaeology of Children: Perspectives from Biological and Forensic Anthropology (New York: Cambridge University Press, 2009), p. 4.

[24] Siân Halcrow and Nancy Tayles, “The Bioarchaeological Investigation of Childhood and Social Age: Problems and Prospects,” Journal of Archaeological Method and Theory, 15:2 (2008), p. 203.

[25] Sahih al-Bukhari, Book 8, #465.

[26] Sahih al-Bukhari 3894 and Sahih Muslim 1422a.

[27] The fact that she reached puberty by age nine is further confirmed in other hadith, such as in Sunan Abi Dawud 4933.

[28] Jessa Gamble, “Puberty: Early Starters,” Nature: The International Journal of Science, October 04, 2017, https://www.nature.com/articles/550S10a

[29] Sandra K. Cesario and Lisa A. Hughes, “Precocious Puberty: A Comprehensive Review of Literature,” Journal of Obstetric, Gynecologic & Neonatal Nursing 36:3 (2007), pp. 263-274.

[30] Peter Gluckman and Mark Hanson, “Evolution, Development and Timing of Puberty,” Trends in Endocrinology and Metabolism, 17:1 (2006), p. 10.

[31] Sahih al-Bukhari 6130, with commentary from Fath al-Bari, vol. 13, p. 143.

[32] Ibn Hajar al-Asqalani, Fath al-Bari, vol. 13 (n.d.), p. 143.

[33] “The prohibition of pictorial and figural representations is confirmed from the Messenger of Allah ﷺ from many sources. It is likely that what is accepted in the narration of Abu Salamah from Aisha (ra) preceded the expedition of Khaybar and that was before the forbiddance of images and representations, then their forbiddance was after that.” – Ahmad Ibn Husayn Bayhaqi, Al-Sunan Al-Kubra, vol. 10, Ed. Muhammad ‘Abd al-Qadir Ata (Beirut: Dar al-Kutub al-Ilmiyah, 2003), p. 371.

[34] Laurie Wilkie, "Not Merely Child's Play: Creating a Historical Archaeology of Children and Childhood," in Children and Material Culture, Ed. Joanna Sofaer Derevenski (New York: Routledge, 2000), p. 102.

[35] Azhar Wan Ahmad, Public Interests (Al-Masalih al-Mursalah) in Islamic Jurisprudence: An Analysis of the Concept in the Shafi’i School (Kuala Lumpur: International Institute of Islamic Thought and Civilization, 2003), pp. 27-28.

[36] Ibid, pp. 5-6.

[37] Mohammad Kamali, Principles of Islamic Jurisprudence (Kuala Lumpur: Islamic Texts Society, 1991), p. 45.

[38] Ibn Kathir, Tafsir al-Qur’an al-Azim — commenting on Al-Qur’an 4:6.

[39] “The jurists who insist on guardianship in marriage seem to consider it to be a duty rather than a right of the guardian, or at least a synthesis of both. While the guardian has the right to conclude a marriage on his ward’s behalf and to give consent or object to her unwise choice, it is his duty to exercise this right in her best interests and he is enjoined to take her wishes into consideration.” – Quoted in Mahdi Zahraa, “The Legal Capacity of Women in Islamic Law,” Arab Law Quarterly 11:3 (1996), p. 260.

[40] Lynn Welchman, Beyond the Code: Muslim Family Law and the Shari‘ Judiciary in the Palestinian West Bank (The Hague: Kluwer Law International, 2000), pp. 108-109.

[41] Judith Tucker, “Muftis and Matrimony: Islamic Law and Gender in Ottoman Syria and Palestine,” Islamic Law and Society, 1:3 (1994), p. 271.

[42] The idea that the majority of our ancestors suffered from a mental disorder that made them attracted to children is a ludicrous notion unsubstantiated by any academic research.

[43] This inference is made in conjunction with verse 33:49 in the Qur’an which stipulates that an unconsummated marriage does not require a waiting period in the case of divorce.

[44] Imam Ibn al-Humam, Fath al-Qadir, vol. 4 (n.d.), p. 280.

[45] Al-Shatibi, Muwafaqat, vol. 3/3, pp. 14-15.

[46] Yossef Rapoport, “Royal Justice and Religious Law: Siyasah and Shari’ah under the Mamluks,” Mamluk Studies Review, 16 (2012), pp. 89–92.