
Global Currents article

The Future of Islamic Authority: New Recipes for Milk Shaykhs

Chief Justice John G. Roberts Jr. administers the oath of office a second time to Barack Obama in the Map Room of the White House on Wednesday, January 21, 2009. Photo Credit: Pete Souza.

When President Obama took the oath of office, he unintentionally uttered a couple of words in the wrong order. Although the meaning of the oath did not change, sticklers wondered whether the oath was legitimate and whether Obama was really president. Doubts like this can linger and spiral out of control into conspiracy theories. So the matter was corrected in a privately conducted do-over ceremony the very next day. The authority of the Constitution was paramount, and every letter mattered. Rituals are tricky business in any culture, and Islam is no exception. Should the prayer be repeated if the person praying wasn’t facing Mecca? Is lab-grown meat halal to eat? Is that financial transaction valid? Can women serve as imams of mixed-gender congregations? Who gets to decide?

The Sharia is complex, and that complexity has bred authority to help resolve, mediate, and manage it. A class of jurist-scholars (sg. ʿālim; pl. ʿulamāʾ) emerged in Islamic civilizations to claim the authority of interpreting God’s will. The ʿulamāʾ were authorized within communities of learning and knowledge transmission. They legitimized rulers as valid so long as those rulers regulated affairs according to the Sharia. Rulers in turn authorized the ʿulamāʾ by employing them to administer law and dispense justice. This structure of authority has been in crisis since colonialism and the rise of nation-states.

The displacement of traditional authority has been traumatizing for Muslim societies, and the ʿulamāʾ have reacted in diverse ways. One response has come to be labeled Islamism: the wholesale attempt to take control of state authority and institutions. Another response manifests in retrenchment: reflexive cultural criticism of anything new and unfamiliar from behind the walls of the madrasa or from the elevation of the pulpit. Between these extremes are alternatives that seek either new forms of alignment between state institutions and traditional scholars (accommodation) or new paradigms of knowledge and authority (reconstruction).

13th-century illustration depicting Muslim scholars in a public library in Baghdad, from the Maqamat of al-Hariri. Bibliothèque Nationale de France.

The temptation to simplify might lead the analyst to see clear distinctions between these camps. But this temptation must be resisted, since the lines between retrenchment, accommodation, and reconstruction are in fact blurred. A single person may be accommodation-dominant, while exhibiting tendencies of retrenchment here, reconstruction there. And what believer in the Sharia is not an Islamist in some sense? After all, are the laws of God not designed for the common good? Should not divine wisdom have at least some relevance to public policy? The crisis of traditional authority is one of interpretation; it is not a difference of opinion on the imperative to submit to God’s will.

God’s will in the world relates to both private and public matters. The Sharia regulates not only prayer and charity, but also crime and punishment. And since debates over interpretation have a direct bearing on policy and legislation, the issue of religious authority is inextricably linked to the concerns of those who wield political power. Muslim-majority societies—whether despotic or democratic, Sunni or Shiite—and Muslims living as minorities all over the world have found their own ways to balance competing interests in the wake of modern dislocations. The space is wide open for creativity, whether ingenious or absurd.

An example of the latter is a notorious fatwa (legal opinion) issued by a scholar of the famed Azhar University to deal with the problem of mixed-gender workplaces. According to classical Islamic law, breastfeeding creates bonds of consanguinity and maternity. Drawing on this principle, the scholar argued, men who drink the milk of their female coworkers would no longer have to observe the strictures of segregation in the workplace, becoming, as it were, a kind of household. In fairness, the opinion was retracted after severe criticism from within the very scholarly guild that had authorized the scholar who issued the fatwa. But the damage was done. As Khaled Abou El Fadl points out, “One would be tempted to ignore this fatwa as a gross aberration or outlier if it were not part of a trend in contemporary Islamic discourse” (47).

This kind of “Milk Shaykh,” as I once jokingly called him, elicits laughter and derision in the classroom. But the point is a serious one: the errant Azhar scholar was attempting to respond to a particular crisis using classical modes of thought. In this case, the attempt was patently absurd and rightfully dismissed by his own traditional colleagues. Other attempts can be more sophisticated, but in their sophistication they only serve to mask underlying problems that are structural in nature: in many instances, the methodology used to derive legal rulings no longer yields acceptable or intelligible outcomes.

Participants in Contending Modernities’ Madrasa Discourses Program discuss contemporary issues in Islamic Studies.

Academic institutions, particularly in North America and Europe, are beginning to provide a counterweight to madrasa scholars who wield authority in the name of the classical tradition. In bringing the tools of higher criticism to bear on Islamic history, academic scholars argue that traditional scholars fail to notice how their supposedly objective interpretations of scripture reveal their subjective biases. According to contemporary convention, there is no such thing as objective knowledge. Knowledge is perspectival, history is contingent, and religious thought merely reflects the spirit of the age. This is not to say that all knowledge is relative, that there is no objective reality, or that things are unknowable; rather, it is to say that all knowledge is conditioned by subjective experience: nobody holds the keys to objective reality, and things are knowable only from particular perspectives.

Ayesha Chaudhry uses the term “idealized cosmology” to express how preconceived notions of reality, particularly when it comes to gender hierarchy, are brought to bear on religious texts. Chaudhry is influenced by scholars like Kecia Ali who argue that the laws of Islam, however brilliant and coherent, cannot simply be tweaked to serve our purposes today. Muslims need to construct a new legal system from the ground up using egalitarianism rather than hierarchy as a founding principle. Rumee Ahmed reads the history of Islamic law as a series of “patches” and “hacks” in an ongoing effort by classical scholars to keep the law relevant for the times in which they lived. His work makes plain what the classical tradition obscures: the law is human and subjective, and any attempt to make it appear otherwise is disingenuous. Ahmed, in a creative move of his own, suggests that the law can in fact be purposefully hacked by anonymous teams of activists working in amorphous online networks to develop legal formulas for our time that get us where we want to go. In this formulation, anyone can be a milk shaykh.

What do these kinds of criticisms mean for religious authority in Islamic societies going forward? Lines have been drawn. Traditional scholars are caught in the old world, ensnared between revolution, irrelevance, and the unwitting support of despotism. The dust has yet to settle from, as Wael Hallaq puts it, “the epistemic havoc wrought by modernity” (542). But if history is our guide, then despair not, for traditions are animated by crises. In a fresh and refreshing take on Muslim theology in response to the present crisis, Martin Nguyen argues that crises are a normal feature of religious life. Was not the “fall” of Adam and Eve a crisis? Does not every prophet respond to some kind of crisis, while generating fresh ones of his own? Are we each not preparing for the ultimate crises of death, resurrection, and accountability in the hereafter? In Islamic history, consider the succession of Muhammad, the civil wars in the days of the Companions, the Umayyad and Abbasid revolutions, the challenge posed by freethinkers and philosophers, the inquisition of al-Ma’mun, the occultation of the twelfth Imam, the reduction of caliphs to figureheads, the Crusades, the Mongols…

Religions are made for crises, and each crisis presents unique challenges. The uniqueness of the present crisis stems from its potential to demystify the past and reset the foundations of our engagement with God’s revelation. “The feminist edge of Qur’anic interpretation,” argues Aysha Hidayatullah, “is the site of dynamic challenges to the boundaries of Islamic tradition” (3). At these edges lies the greatest potential for creative renewal. The conversation on authority tends to retreat to simplistic either/or categories: either the accommodationist tradition or the new wave of academic criticism toward reconstruction. If the vitriol on social media between these two camps is any indicator, the future may in fact be charted through such exclusivist binaries. Exclusive camps have always been there, and they always will be. Things are most interesting, however, at edges, intersections, and boundaries. That is where complexity resides, and that is where the kind of enduring authority that will command the middle can best plant its flag.

Mahan Mirza
Mahan Mirza was appointed teaching professor and executive director for the Keough School's Rafat and Zoreen Ansari Institute for Global Engagement with Religion on July 1, 2019.

An Islamic studies scholar and expert on religious literacy, Mirza brings extensive pedagogical and administrative experience to his roles at Notre Dame, including serving as dean of faculty at Zaytuna College in Berkeley, California, America’s first accredited Muslim liberal arts college. Prior to his appointment as executive director of the Ansari Institute, Mirza served as the lead faculty member for Notre Dame's Madrasa Discourses project, which equips Islamic religious leaders in India and Pakistan with the tools to confidently engage with pluralism, modern science, and new philosophies.

Mirza holds a B.S. in mechanical engineering from the University of Texas at Austin, an M.A. from Hartford Seminary, and a Ph.D. in religious studies from Yale University. He has taught courses and lectured on Arabic-Islamic studies, western religions, and the history of science, along with foundational subjects in the liberal arts, including logic, rhetoric, astronomy, ethics, and politics. He has edited two special issues of The Muslim World and served as assistant editor for the Princeton Encyclopedia of Islamic Political Thought (2018).

He is a fellow with the Liu Institute for Asia and Asian Studies at the Keough School and continues to serve as an advisor for Madrasa Discourses.
 
Field Notes article

Without Literacy, There is No Wisdom: Reflections on Madrasa Discourses’ Summer Intensive in India

Madrasa Discourses faculty and students. New Delhi, India, July 2019

At the end of its four-day intensive program (July 23–26, 2019) in New Delhi, India, Madrasa Discourses (MD) bade farewell to the first batch of its Indian participants. After three years together, the very thought of a farewell was overwhelming for both the faculty and the participants; overcome with strong emotions, some even choked up and struggled to deliver their closing comments. Nevertheless, most were able to express the significance of the program for the madrasa community of India. Three points can be gleaned from their comments and impressions of the program: (1) the program taught them the value of learning and disseminating knowledge for the common good of society; (2) it trained them to interpret the Islamic tradition in the light of modern knowledge and changed circumstances—a requisite intellectual tool, indeed, for keeping any tradition alive; and (3) though the event marked their departure from the program, it did not mark the end of their intellectual journeys. The emotions and aspirations of the participants showed that they had acquired a sense of responsibility towards their community, a community which is going through, in the eyes of many, an “intellectual crisis.” MD can be considered an initiative to redress this crisis.

This essay outlines the discussions among those ʿulamāʾ invited to the symposium held on the first day of the intensive program at Jamia Millia Islamia University, New Delhi, on July 23, 2019. It raises concerns about some of the approaches taken by the scholars who attended and points to alternative ways of approaching Islamic thought in the modern world. It should be noted that these are the author’s personal reflections and should not be attributed to MD or any of its faculty.

Ebrahim Moosa speaking at morning session of public program. Jamia Millia Islamia University, New Delhi, India. July 23, 2019

This intensive was the sixth held by MD during the last three years, and it carried a significance unlike that of previous intensives. Until now, MD had not been officially introduced to the circles of traditional ʿulamāʾ and the Muslim intellectual community of India. The first day of the program was designed to facilitate an open discussion of the challenges that Islamic thought and the madrasa community have encountered. Prominent ʿulamāʾ and scholars were invited from across the country to take part in the discussion. The objective was to address the contemporary intellectual challenges to Islamic thought, and to inform the attendees that MD, with the help of the madrasa community, is committed to answering some of these challenges. The program was divided into three sessions: the first was devoted to invited lectures, and the remaining two were dedicated to a discussion of the tensions between modern science, speculative theology (‘ilm al-kalām), and questions of Islamic jurisprudence (fiqh).

In the first public session, Mawlana Khalid Saifullah Rahmani, an eminent Deobandi scholar and head of the Islamic Fiqh Academy of India, delivered the keynote address. While one might have expected him to express his views on the contemporary theological challenges Muslims face, Rahmani instead highlighted the historical phases that necessitated the emergence and evolution of early Muslim theology (‘ilm al-kalām). His understanding of ‘ilm al-kalām follows the common position which holds that the discipline is an intellectual exercise meant to defend Islamic creeds. Rahmani drew upon the historical experiences of Muslim scholastics and theologians and emphasized the need for a new ‘ilm al-kalām, one more comprehensive and able to address the modern challenges posed by philosophical and scientific questions. However, he made it clear that the creeds explicitly mentioned in the text are beyond the scope of any intellectual debate. Rahmani also argued that the battle between science and reason on one side and religion on the other stemmed uniquely from the experience of Christendom, where it developed into a revolt against the authority of the Catholic Church, followed by the persecution of philosophers and scientists. He suggested that modern philosophy and science contain anti-religious elements today because of the Catholic Church’s early opposition to both. Islam, on the other hand, had always appreciated the use of reason and thus can constructively engage with modern scientific challenges.

Rahmani’s reference to the historical development of modern science and philosophy forges a story that seeks the culprit in someone else’s camp. It holds the Church as the arch-culprit for fostering enmity between science and religion. What purpose such an account was intended to serve was not entirely clear. Perhaps it was intended to defend Islam and Muslims in terms of their historical encounter with science. If so, the opposition of religion to philosophical exercise is not an unparalleled historical experience found only in the medieval Christian world. One can cite numerous examples of the anathematization of great Muslim philosophers and mutakallimūn by Muslim clergy in the medieval Islamic world, in some instances followed by their persecution. It would have been better had Rahmani engaged with the historical tensions and complexities that existed between Islamic orthodoxy, philosophy, and kalām. It also might have been helpful had he demonstrated how Islamic thought deals with contemporary science and philosophy.

Interestingly, the participants in the closed sessions concurred with Rahmani that contemporary Muslims need a new ‘ilm al-kalām. Mawlana Mahmood Ahmad Ghazi, acting as the moderator, began the second closed session by asking whether Islamic creedal doctrines can be neatly categorized as debatable and non-debatable. Addressing the question, Mawlana Zishan Ahmad Misbahi stated that certain creeds, like the oneness of God, are mentioned in the Qur’an and therefore need to be left untouched. However, Misbahi claimed, those which were debated in the early history of Islam can also be debated today. Misbahi’s classification of some aspects of the creeds as fundamental and others as peripheral was consonant with Rahmani’s views, which, I think, represent the general approach taken in traditional thinking. Had the discussion continued along the same line, it could have yielded more valuable insights. However, the conversation veered in the direction of an apologetic theology, one which seeks to offer Qur’anic justifications for existing scientific beliefs. Such an approach superficially eradicates any contradictions between science and theology, but engages neither science nor theological reasoning on its own terms. One critique that could be leveled at scholars who take this approach is that they entirely avoid addressing the underlying philosophical complexities—the tensions between medieval and modern presumptions about religion and science—and hence cherry-pick from a variety of available theories. Furthermore, this method of resolving contradictions between science and religion frequently fails to employ a methodological framework based on sound philosophical presumptions, relying instead on an apologetic style of scriptural reasoning.

For example, one participant in the discussion reiterated the famous argument that the domains of religion and science are different. According to this participant, religion focuses on generating values on the basis of revelation and is thus redemptive in nature, while science concentrates mainly on developing a lifestyle compatible with technological advancement; the latter, he claimed, thus has nothing to do with values. Though the argument seemed convincing to some in the audience, placing science and religion in two separate domains is problematic because human beings do not live their lives in ways that mirror such a bifurcation. We live in a world created by modern science and have adapted to it. Treating religion as something unrelated to science is thus a grave mistake and produces contradictions in actions and beliefs.

Afternoon Session of Madrasa Discourses Program. Jamia Millia Islamia, New Delhi, India. July 23, 2019

For most of the participants, the word “science” referred only to the material sciences. An unidentified participant suggested that the social sciences should also be considered in a discussion of science. A professor of physics claimed that the social sciences were entirely based on ignorance, or jihālat; an observer could not tell whether he was being sarcastic or serious. It is interesting to note that the term jihālat or jāhili connotes, especially for Islamists, a system—social, political, or epistemic—that is essentially anti- or un-Islamic.

In the last afternoon session, the discussion turned to Islamic jurisprudence. I recall a scholar trained in religious education (ʿālim; pl. ʿulamāʾ) arguing that Islam prohibited slavery a very long time ago, and that those Muslims who practiced it were actually violating the rules of Islam. He also explained the “wisdom” embodied in one of the most famous verses of the Qur’an, 4:34, which appoints men to be in charge of women. The verse states that if a wife disobeys her husband, he is allowed to strike her; it has long been a bone of contention among feminists, modernists, and traditionalists. The ʿālim explained that the verse allows a husband to strike his wife only if she indulges in fornication. He was instantly asked whether a wife should be allowed to strike her husband if he did the same. He replied, without pausing, that it would be unnatural for her to do so. To me, the overall view of the ʿālim sounded rather odd. I later discovered that his view is not supported by Islamic law, which reserves the right to administer punishment exclusively to the government; Islamic law gives a husband no permission to strike a wife who engages in fornication.

Such explanations assume that every good which can possibly exist has already been explained in the Qur’an. On this approach, whereas in Western thought the human rights that ground prohibitions on slavery and grant equality to women are a modern development, in Islam they are much older, since they originated with the revelation of the Qur’an: Islam allocated these rights, within an alternative framework, fourteen centuries ago. This type of reading of the Qur’an worries me because it imposes a modern understanding of the world onto past periods of Islamic history while ignoring the different contexts that shaped how Muslims understood and interpreted the Qur’an. For instance, the modern discourse of human rights entitles a person to certain rights, such as freedom and equality, on the basis of his or her being a human being. The word “right” in this context is thus interpreted as an entitlement, and a violation of any human right by an individual or a state is an affront to the human person. A comparison of the modern Western conception of “right” with the medieval Islamic understanding of the word ḥaqq (misleadingly translated as a “right,” like a human right) reveals that Muslim jurists conceived of ḥaqq not as an “entitlement” but as a “duty” toward one’s fellow citizens and the state, which runs contrary to modern rights discourses (Moosa, 2000–2001). This example shows the problems that arise when one equates modern rights discourses with traditional Islamic thought. I could produce several more examples demonstrating the conceptual departure of Muslims from the medieval to the modern world; the scope of this essay, however, does not allow it. The point is that Muslim scholars should accept this departure, show the courage to shoulder accountability for the history they inherit, and be ready to reinterpret those parts of their tradition which are inconsistent with their modern experiences. Though it is easy to reject the past or to confuse the modern with something already present in the tradition, doing so intellectually paralyzes Islamic thought.

Mohammad Ali
Mohammad Ali is a PhD Fellow at the Department of Islamic Studies at Jamia Millia Islamia in New Delhi, India. He is also a graduate of the Madrasa Discourses program.
Global Currents article

Weaponizing Antisemitism under the Guise of Care

A U.S. joint forces color guard displays the United States of America and Israel flags during a visit from Lt. Gen. Benjamin Gantz, IDF Chief of the General Staff. 18th Chairman of the Joint Chiefs of Staff Gen. Martin E. Dempsey welcomes Israel Defense Forces Chief of the General Staff Lt. Gen. Benjamin Gantz to the Pentagon, Jan. 8, 2014. DoD photo by Army Staff Sgt. Sean K. Harp/Released.*

The forests of the Amazon are burning. The lungs of the planet will collapse and take the rest of us with them. The United States government is detaining large numbers of immigrants, separating children from their families, deporting parents who work in this country, and planning to lift the limit on how long people can be detained. Americans are dying because they cannot afford their prescription medications, and mass shootings are on the rise while the government does nothing about gun control. But the current resident of the White House made some outrageous remark. Another outrageous remark, just one of several on any given day. And so, all the major news outlets shifted their focus from the enormity of climate change, the precarity of the human race, and our cruelty to each other. This time, the comment was about Jews in America. It was a troubling remark and one that recalls a long history of antisemitism. This kind of talk is dangerous. So, we are swept up by the news cycle and shift our attention to the agenda set by racism—again.

The current resident of the White House, a symptom of larger processes, proclaimed that Jewish Americans who do not vote for the Republican party are disloyal. Not disloyal to their country of citizenship. No, they are traitors to the state of Israel. By the time this post is published many readers will already have seen countless responses to the antisemitism inherent in the idea of Jews as disloyal. So, rather than focus on that specific comment, I want to point to a broader process underway: a shift in antisemitic rhetoric that might make it harder to recognize and therefore all the more dangerous.

Rep. Ilhan Omar speaking at a Hillary for Minnesota event at the University of Minnesota. Photo Credit: Lorie Shaull.

In addition to his remarks about disloyal Jews, the same man has also denigrated three junior members of Congress by weaponizing the accusation of antisemitism. He says that Rashida Tlaib, Ilhan Omar, and Alexandria Ocasio-Cortez are antisemitic. These three junior congresswomen, women of color, include one Muslim woman of Palestinian descent, one Somali Muslim woman in hijab, and one Latina. They are outspoken in their opposition to Israeli policy. They call for the end of Israel’s military occupation of Palestinian territories and for human rights for Palestinians. The man in the White House calls them antisemites. He says that they hate Israel and that they also hate all Jews.

These accusations—one of disloyalty and the other of antisemitism—are themselves forms of antisemitism. This weaponized antisemitism is dangerous for all minorities and people of color. It singles out migrants, and Muslims in particular. It also participates in a much longer historical process of making Jews and Muslims enemies of Europe and of each other, while conflating Muslim with Arab in ways that continue to have serious consequences in Israel/Palestine. But it is especially dangerous for Jews. American Jews have long felt relatively comfortable in their country, and relatively safe from antisemitic rhetoric or acts of violence. The current focus on Jews and the questioning of their loyalty reminds Jews of a long history of similar accusations.

The claim of disloyalty resonates profoundly with older antisemitic tropes. The Jews were accused of disloyalty in medieval Christian times, during the Inquisition, and, more recently, in pre-WWII Europe. Throughout history, Jews have been accused of disloyalty to the crown, the country, the Pope, or to Christ. But how can American Jews be accused of disloyalty to a foreign country? The idea is bizarre unless one presumes that all Jews should support Israel first and at all times because of some kind of tribal, ethnic, or religious affiliation. Should American Catholics be loyal to Rome? Should American Muslims support the governments of Pakistan or Saudi Arabia?

Beyond the absurdity lies the age-old accusation leveled against Jews everywhere: they are disloyal to their country of residence because they are Jewish. In part, this idea results from an inherent problem in the modern identity categories through which we generally navigate our social lives. These categories suggest that national belonging and religious affiliation are separate aspects of identity. A person can be French and Catholic, or Dutch and Buddhist, at the same time. Scholars such as Charles Taylor contend not only that these categories are separable, but that in our modern, supposedly secular age, “religion” is a matter of personal choice. That means a French citizen can choose to be Catholic or Protestant or join some other faith group. And a person can even choose not to be affiliated with any religion. The idea that such distinctions can be made is often considered a specifically Protestant understanding of the term “religion.” Whether or not one agrees with that analysis, it seems clear that such is not the case for the Jews. In the figure of the Jew, the categories of national and religious belonging are at once separate and conflated. While a Jewish person might be non-observant, or practice another faith, the racialization of Jewishness has meant that no matter what they do, a Jew will always be Jewish. Importantly, Jewishness is understood as at once a national and a religious category. Thus, regardless of their country of residence or citizenship, Jews are always and forever foreigners who cannot be trusted as patriotic citizens. They might be granted citizens’ rights, might completely assimilate to the local culture, and might even convert to Christianity. However, they will always be suspect.

Many Peoples One Nation: Let us Unite to Americanize America. Printed and published by Ray Greenleaf, 1917.

Hannah Arendt discussed this conundrum in a piece called “We Refugees.” In that 1943 article, Arendt wrote about the uselessness of attempts at assimilation. She spoke of a character she called Mr. Cohn who, when he lived in Germany, was 150% German. When forced to leave Germany, he moved to Prague and became 150% Czech, then 150% French, and so on, interminably attempting to demonstrate that he was forever, everywhere and anywhere, a loyal citizen; anything but a Jew.

Now, in the United States of 2019, we hear people opposing the current Republican administration by quoting another Republican, Ronald Reagan, who once fondly quoted a letter he received that said, “You can go to Japan to live, but you cannot become Japanese. You can go to France to live and not become a Frenchman. You can go to live in Germany or Turkey, and you won’t become a German or a Turk.… Anybody from any corner of the world can come to America to live and become an American.” It is impossible to repeat this quote without pointing out its erasure of the settler colonial past of the United States, its ongoing genocide of Native Americans, and the fact that the U.S. is a society built on the backs of enslaved laborers, that is, of people who did not just “come to live in America.” Beyond that, it seems clear that if Jewish Americans are called upon by their government to be loyal to the state of Israel, this picture of becoming American is nothing more than a myth, no different from Arendt’s assessment of the impossibility of assimilation.

But what does it mean to demand “loyalty” in the first place? Loyalty to whom? For what? Is loyalty the same as patriotism? Is it central to democratic living? Should we encourage unquestioning support of state policies? Or would we do better to encourage engagement and critique? The danger underlying this call to loyalty lies in its promotion of an ethno-nationalism that undermines both the possibility of equal rights for all citizens in a multicultural society and the achievement of such rights on the basis of our common humanity. Instead it reinforces the exclusionary nature of nationalism, with all its precarity for those deemed outsiders.

This time, though, the accusation of disloyalty came wrapped in a kind of philo-Semitism. “We care about you,” it said. “We, the Republican party, are staunch supporters of Israel, and therefore, by definition, we are good for you. Good for the Jews. And you—Jewish Americans—would be doing the right thing if you supported our party.” The Republican party, or its leader in the White House, is now proclaiming who is a good Jew and who is not. When powerful gentiles begin deciding who is a good Jew, we should all be worried. The leader of the Republican party not only proclaims what counts as “good” Jewish behavior. He has gone one step further and determined that he, and not the members of the Jewish community themselves, knows what is good for them.

One of the things that is “good for Jews,” he says, is unconditional support for the state of Israel. Therefore, the three junior congresswomen are not good for the Jews. They are enemies of the Jewish people. These proclamations are all intended to garner support for his presidency, to appeal to his constituency, including Evangelical Christians (Christian Zionists), and maybe even to reach out to Jewish Americans. But they do something else, too, something more insidious. These comments say that all Jews are, or should be, of one mind. They should all share the same political opinions. In particular, they should all unconditionally grant their loyalty to a foreign country and oppose their own fellow citizens—in this case the representatives for whom some American Jews voted—who dare to criticize that foreign country’s policies. In other words, because of who they are, Jews cannot and should not be allowed to identify primarily with their fellow Americans. They must be marked, themselves, as foreigners; their racial and national identity must be conflated again, still.

Close the Camps Rally, San Francisco, California. August 23, 2019. Photo Credit: Peg Hunter.

This notion of “all Jews,” of course, presumes a homogeneity that is very far from the truth. No religious or cultural group is homogenous, certainly not the Jews. To suggest otherwise encourages stereotypical ideas and is fundamentally racist. This notion of Jews as permanent foreigners, inherently suspect, feeds into larger tropes of global Jewish conspiracies, of their worldwide power, and of their fealty first and foremost to each other. Suggesting that Jews are disloyal to the modern state of Israel is a form of antisemitism, thinly veiled in terms of caring about Jews. This notion, together with accusations of antisemitism aimed at people of color and those who are critical of Israeli policy, combines to form a toxic antisemitism that stigmatizes some people while producing enemies of groups who might otherwise find shared interests.

When Republicans tell Jewish Americans, “we care about you, we know what is good for you,” they are not only infantilizing the entire Jewish population. They are repeating the two-thousand-year-old tale of Christians having replaced Jews as the chosen people of G-d, the focus of His covenant with humankind. When they say, “we are looking out for you by pointing out antisemitism among others,” they are weaponizing antisemitism, promoting division, fear, and hatred that in the end are dangerous to the Jews. Thus the historical Jewish Question, which many expected would end with the establishment of the state of Israel, instead re-emerges and is transformed. We hear in this rhetoric of care echoes of the idea that Christians or Gentiles know better what is “good for the Jews” than do Jews themselves, as the promises and protections of citizenship are, once more, undermined by a toxic ethno-nationalism based in an idea of immutable difference.

 

*The posting of this image does not imply Department of Defense endorsement of the claims made in this essay.

Joyce Dalsheim
Joyce Dalsheim is a cultural anthropologist and Associate Professor of Global Studies at UNC Charlotte. She is author of Unsettling Gaza: Secular Liberalism, Radical Religion and the Israeli Settlement Project (2011), Producing Spoilers: Peacemaking and the Production of Enmity in a Secular Age (2014), and Israel Has a Jewish Problem: Self-Determination as Self-Elimination (October 2019), all published by Oxford University Press.
Global Currents article

On Disloyalty and Dual Loyalty: Is President Trump a Brandeisean Zionist?

Alt-right counter-protesters holding Israeli, Trump, and American flags and ‘Stop Holocaust Exploitation’ posters at the ‘Close the Camps’ rally at the Farmington Hills Holocaust Memorial Center, Michigan. August 20, 2019

President Trump’s comment last week that American Jews who vote for the Democratic Party are being “greatly disloyal” has sent shock waves through the Jewish world. What did he mean? Disloyal to whom? Isn’t the accusation of disloyalty an anti-Semitic dog whistle? By not initially defining the object of “disloyalty,” Trump opened up various possibilities for how his comment can, or should, be parsed. Groups ranging from Jewish Voice for Peace and J Street on the Jewish left to the center-right AIPAC contested the statement, as did President Rivlin in Israel. The Likud-led government said nothing. The overt partisan overtones of the comment led Jewish Trumpists to defend the president, saying that, even if the comment was poorly articulated, American Jews who vote for “the party of Rashida Tlaib and Ilhan Omar” (this is how Trumpists reconstruct “the Democrats”) are being disloyal to Israel and, given the current fusion of pro-Israelness and allegiance to the Jews, are being disloyal to the Jewish people.

Critics of the comment claim that the language of disloyalty is anti-Semitic. The problem is that in this case the “dual loyalty” equation implied in Trump’s “disloyalty” locution seems to be inverted. The anti-Semitic trope of dual loyalty has historically been used to claim that Jews are disloyal to their country of residence in favor of their loyalty to the Jewish people; later, the question of loyalty extended to the state of Israel. In the late 18th century, in the course of the debates over emancipation, the question of Jewish loyalty was posed to the Jewish sages in France. Could French Jews, who ostensibly had loyalty to another collective (the Jewish people), ever be fully loyal to France? The French sages responded that the Jews are a people of a religious tradition and not a nation, and thus fidelity to the Jewish people does not stand in contradiction to allegiance to the French nation. On this basis they were emancipated.

The Trial of Dreyfus. Vanity Fair, November 23, 1899

This precarious equation lasted until it exploded with the Dreyfus affair in 1894, when French officer Alfred Dreyfus was accused and convicted of treason. (The conviction was later overturned.) A young Viennese journalist named Theodor Herzl covered the story and heard the reactions of the many French men and women who were convinced of Dreyfus’s guilt. These men and women claimed that Dreyfus, as a Jew, could never give full allegiance to France. In some way, this gave birth to Zionism, which claimed in part that even the fully assimilated Dreyfuses of the world would never be accepted as full citizens in their countries of residence. That is, Jews could never fully shed the suspicion of dual loyalty.

While “dual loyalty” is indeed often an anti-Semitic canard, it is also a real challenge for all of Diaspora Jewry, especially after the establishment of the state of Israel. What is meant, for example, by the 1990s bumper sticker, “I Love New York but Jerusalem is My Home”? What if a passerby said to the driver of a car with such a sticker, “Go home”? Would that be anti-Semitic? In a way, yes, it might be. But what about the bumper sticker itself? What is it trying to convey? Today, what does “home” refer to if not the capital city of another country? My point is that if Jews reflexively claim that the accusation of “dual loyalty” is anti-Semitic, we too easily ignore that it was, and remains, one of the great challenges of Jews in modernity.

Here is an example: I have friends—Modern Orthodox Jews—whose son was considering enlisting in the U.S. Navy. Almost everyone he shared his plans with at a Shabbat Kiddush said to him, “If you are going to serve in the army, why not join the IDF?” His response was, “because I am an American.” But what was behind the response of these American Orthodox Jews to this young man? “If you are going to risk your life,” they may have been thinking, “why do it for the U.S.? Why not do it for Israel?” Is this an instance of dual loyalty? If we deflect all accusations of dual loyalty, we miss the ways we practice it all the time.

Trump’s comment on Jews’ “great disloyalty” wasn’t accusing Jews of dual loyalty; in fact, he was suggesting that Jews are not exercising dual loyalty enough! Can we thus say that this isn’t anti-Semitic at all? Yes. And no. In the wake of Trump’s comment, at a Close the Camps rally of liberal American Jews held outside the Holocaust Memorial Center in Farmington Hills, near Detroit, a white supremacist counter-demonstration featured white nationalists waving an American and an Israeli flag. White supremacists, whose worldview emerges from the KKK and other like-minded groups, were waving the Israeli flag to protest against liberal American Jews. Is there a connection between Trump’s comment and this phenomenon?

One way to understand this is to look back at the American Zionism of Supreme Court Justice Louis Brandeis. In the teens of the 20th century, when most American Jews were at best ambivalent about Zionism, precisely because it exacerbated the anxiety of dual loyalty, Brandeis stood as a proud Zionist. But Brandeis deeply understood the challenge Zionism posed to a Jewry desperately trying to become “American.”

In a lecture, “The Rebirth of the Jewish Nation,” Brandeis said the following:

My approach to Zionism was through Americanism. In time, practical experience and observation convinced me that Jews were by reason of their traditions and their character peculiarly fitted for the attainment of American ideals. Gradually, it became clear to me that to be good Americans, we must be better Jews, and to be better Jews, we must become Zionists.

In another essay, “The Jewish Problem: How to Solve It,” presented to the Conference of Eastern Council of Reform Rabbis in 1915, Brandeis wrote:

Indeed, loyalty to America demands rather that each American Jew become a Zionist. For only through the ennobling effect of its strivings can we develop the best that is in us and give to this country the full benefit of our great inheritance. The Jewish spirit, so long preserved, the character developed by so many centuries of sacrifice, should be preserved and developed further, so that in America as elsewhere the sons of the race may in future live lives and do deeds worthy of their ancestors.

Israeli and American Flags fluttering on mast. Photo Credit: James Emery

The fusion of Americanism and Zionism was Brandeis’s way of allaying the fears of dual loyalty among many progressive-era American Jews. While I am quite certain Trump does not know of Brandeis’s Zionism, and Brandeis certainly did not intend his Zionism to be blind allegiance to a Jewish state (his Zionism did not even promote a Jewish state), I think Trump is inadvertently using Brandeis’s logic to make two points. First, on Trump’s reading, American Jews are being disloyal to their people if they do not give full allegiance to the state of Israel and its government (Brandeis, of course, died before the state of Israel was founded). And second, such disloyalty is also disloyalty to America since, as Brandeis said, “to be good Americans, we must be better Jews, and to be better Jews, we must become Zionists.”

So when those white supremacists waved an Israeli flag to counter an American Jewish protest against detention centers in a Detroit suburb, the flag wasn’t about being pro-Jewish (one could reasonably assume some of them hold anti-Semitic views) or even pro-Israel in any conventional way. It was pro-American. American Jews who support “the party of Tlaib and Omar” are thus not only disloyal to their people; they are also disloyal to America. Trump’s inversion of the “dual loyalty” equation does not erase its anti-Semitic connotations; it merely recalibrates its implications. Watching white supremacists wave an Israeli flag, as jarring as it may look, is therefore not dissonant at all.

 

Shaul Magid
Shaul Magid teaches Modern Judaism at Harvard Divinity School and is a senior research fellow at the Center for the Study of World Religions at Harvard. His latest books are Meir Kahane: The Public Life and Political Thought of an American Jewish Radical (Princeton University Press, 2021), and The Necessity of Exile: Essays from a Distance (New York: Ayin Press, 2023). He is an elected member of the American Academy for Jewish Research and the American Society for the Study of Religion.
Global Currents article

CRISPR-Cas9, Practical Wisdom, and Human Identity

 

U.S. Customs and Border Protection chemist reads a DNA profile to determine the origin of a commodity. Photo Credit: CBP Laboratories Photography, 2006.

The scientific evidence suggests that the CRISPR technique is more precise than older, cruder techniques of genetic engineering.[1] Most scientific discussions about CRISPR lean towards medical applications, especially its seeming promise with respect to currently incurable human diseases and its use as a tool in the study of human genetics. The broader public debate has narrowed its focus to questions about application in human genetics, alongside worries about a slippage towards human enhancement and associated issues of justice concerning access. Many of the specific ethical questions that arise in advisory bodies are the same ones already all too familiar to those who have worked in the ethics of human genetics, namely questions of safety, scope of use, and the means of achieving the end sought. That is, such bodies are most comfortable dealing with issues like safety, which amounts to a thin version of ethics that misses thicker ethical concerns.

What I want to offer here is not a claim to provide a possible consensus between such disparate groups, but rather a way of thinking which can help rescue our ability to talk about more than safety and efficacy. This approach retrieves the virtue of practical wisdom, which I believe is all the more relevant now that speculation about the possibility of accurate human gene editing is closer to becoming a reality.

Thomas Aquinas held that there are small steps ordinary people can take in order to acquire the virtues, even people who do not have any particular religious faith. Crucial to those small steps is the exercise of practical wisdom. Practical wisdom is a source of insight and a virtuous disposition that is particularly useful in the conduct of ordinary human affairs. Because it is aimed at the common good, it can be applied in specific circumstances in different ways. Hence, a particular decision that follows the exercise of practical wisdom takes multiple factors into account, while keeping an open eye on whether that decision serves to achieve the goal of the common good.

While the moral virtues, such as justice and courage, on their own will incline their possessors towards right action, this inclination is not sufficient, which is why practical wisdom is so important. Practical wisdom helps to recognise those subtle differences that lead to a different course of action in given circumstances. Part of the challenge for CRISPR-Cas9, as with any other new and potentially influential technology, is that ethical decision making should not consider only the implications for one individual or family, but should also consider the wider socio-political implications. Truly moral decisions are not based on autonomy alone.

Practical wisdom, for Aquinas, has eight qualities, all of which are important in making a good decision. These qualities are: memory, teachableness, acumen, insight, reasoned judgement, foresight, circumspection, and caution. Memory (memoria) must be “true to being.” And it does not take long to realize that historical reflection forces a closer look at the long shadow of eugenics in the application of genetic science, a manipulation of human reproduction and discrimination against those with disabilities for ultimately political ends. Teachableness (docilitas), or open-mindedness, is a quality that many scientists will respect, since without open-mindedness discovery is much more difficult. But it is also a reminder that decisions are always embedded in complex networks of human needs and interests.

Acumen (solertia) includes the ability to act clearly and well in the face of the unexpected. Acumen makes it possible to act aright even when the time to make a decision is compressed. Insight and reasoned judgement, which are also in the list of intellectual virtues that practical wisdom requires, need to be brought to bear. Yes, some readers will now ask questions regarding, for example, whose insight and which reasoned judgements are assumed in such an account, but these questions do not undermine the effort to discern what should be done. What seems reasonable to one may not be to another, but in so far as prudential reasoning includes deliberation, it tries to take into account different reasonable points of view.

Bible, Photo Credit: MyfanwyX

What additional elements need to be in place for practical wisdom to be possible? The first element here is foresight, the human corollary of divine providence: divine providence always aims at the ultimate good, and foresight seeks to imitate that orientation. Foresight is the ability to know whether certain actions will lead to a desired goal. The judgements of practical wisdom are not fixed or certain in the ways they might be if practical wisdom were simply an application of rules or principles. This component is crucial for judgments about CRISPR-Cas9, especially given that many of the predicted beneficial effects of genetic medicine have not come to pass. Is this newest and seemingly most promising technology an exception to that trend, or is it yet another example of over-enthusiasm in the wake of a new and exciting discovery? Are the uncertainties small enough to be tolerated or not? And who will be the major beneficiaries?

Human Genome, Photo Credit: Adam Nieman

Aquinas also includes circumspection and caution in the list of the components of practical wisdom. Circumspection is the ability to understand the nature of events as they are now, while foresight is the ability to understand events as they might be in the future. The difficulty with CRISPR-Cas9 is that it is very hard for a non-specialist to understand what is, in fact, certain knowledge and what is less so. Caution has to do with avoiding imprudent acts that are too hasty and with avoiding obstacles that might get in the way of sound judgement, though caution that leads to inaction is not really what Aquinas had in mind either. In this sense, freezing all action out of an over-inflated sense of caution may not be appropriate, but caution has to keep in mind the overall trajectory of scientific research in this field. Caution here refers not just to safety issues, but to wider, more substantial questions about the kind of human community that is envisaged—in other words, about what human flourishing actually means. In addition, Aquinas recognises the place of gnome, that is, the wit to judge when departure from principles is called for in given situations.

Practical wisdom, as setting the mean of the moral virtues, is concerned with individual prudential decisions. But practical wisdom reaches beyond this in order to inform political governance. While Aquinas’s discussion of practical wisdom bears some relationship to Aristotle’s, in this respect it is different, for Aristotle confined his attention to individuals. The common good is that which relates to the good of all and the good of each, and in Aquinas’s time it meant the state. While the rule of nation-states is more complicated now, given international laws and transnational companies whose power exceeds that of some states, the overall intention of political practical wisdom towards the common good still applies.

Part of the contestation of CRISPR is related to questions about what that good means, and for whom. In other words, what does it mean for a human community to flourish? Aquinas is also more communitarian compared with the individualism that prevails in the current climate, so when individual practical wisdom clashes with economic or state practical wisdom, the former has to give way to the latter. Distributive justice and political practical wisdom work together for the same end though they can be distinguished in their role. It may be that the rhetoric of the “common good” was once used to promote eugenic practices. But in the current context of deliberations over the use of CRISPR technologies, using such technologies to promote racial purity by a powerful elite for their own particular ends would be necessarily excluded. Hence, rather than opposing eugenic practices by avoiding any collective sense of what the good might require and resorting to individual autonomy as the way forward, a more promising approach is to insist on a greater scrutiny of what social, political, and collective goods require using the tools of distributive justice and political practical wisdom.

Just as individual practical wisdom sets the mean for the moral virtues, so political practical wisdom sets the mean for distributive justice. Distributive justice is concerned with the relationship between the community and individuals, but what this distributive justice might require is not self-evident in all cases, and needs to be supplemented by political practical wisdom in much the same way as correct decision making for the moral virtues must be supplemented by individual practical wisdom.

Political practical wisdom is one way of helping to heal the rift between public and private morality, and the false divide between a “subjective” virtue ethic concerned with individuals and principled “objective” approaches more often concerned with wider social contexts. This is particularly significant in adjudicating heated public contestations over CRISPR technologies, since much of the discussion seems, like that of many other controversial issues, to rest on key exemplars which provide the basis for lobbying either for or against the technology. Take, for example, the case made by Erika Check-Hayden based on the example of Ruthie Weiss, who has albinism and who has appeared in media reporting on CRISPR. Check-Hayden reports that when you ask patients like Ruthie, or her parents, whether they would have used CRISPR to prevent albinism, the answer is a resounding no. Why? Because what makes Ruthie Ruthie are the challenges she has faced and her particular determination to live in spite of these disadvantages.

Poignant though this story is about the virtue of perseverance in the face of hardship, I am less convinced by arguments of this type, because they rest on the particular subjective experience of an individual who suffers from a particular disability. Was it prudential for the parents to indicate that Ruthie should not have been engineered? Of course, from the parental perspective, given that Ruthie’s life was viewed as positive, they would not have wanted Ruthie to be anything other than who she is. Their memory of the positive aspects of her life informed their judgments about what was right to do. But what if both Ruthie and her parents had suffered inordinately from her condition and could imagine doing virtually anything to change it? In that case, the option of CRISPR could well have seemed prudential to those parents. The point is that prudence takes into account not just our subjective feelings and experiences but wider societal constraints; circumspection includes knowing the details from many different perspectives, so familial anecdotes are insufficient grounds for public policy. Further, the parents’ assumption that Ruthie would have been changed for the worse misunderstands the nature of genetic engineering. Ruthie would have been a very different child had engineering been permitted, so it is virtually impossible to project back into the past and ask whether some of her unique characteristics would thereby have been compromised. The voices of those who have been excluded from discussion certainly need to be taken into account, but as a way of informing wider discussion rather than as a few emotively charged, media-driven examples on which everything rests.

Bookcase with Human Genome. Photo Credit: several_bees

Practical wisdom applies at different levels: the level of the individual, yes, but also the levels of the family, the community, and the state or system of governance. Such an approach, which stresses a movement away from isolating the individual and towards complex, multivalent levels in envisaging the good, applies whether or not a specifically Christian and Thomistic understanding of that good is sought. To be clear: individual goods are not denied in the approach I am arguing for, but such goods are sought within a much broader context of what the good might mean as embedded in specific social contexts operating at different levels. Bigger questions, which relate to that part of practical wisdom called foresight, include taking account of broader consequences: whether the technology is desirable at all for the common good, who is really going to benefit from its use, what implications it carries for a given community, what impact such applications might have on the use of resources, and so on. Which population groups will be used in the clinical trials that will inevitably be set up to test efficacy, for example of gene technologies that work to “correct” AIDS or other immune deficiency diseases such as Severe Combined Immune Deficiency (SCID)? Single-gene diseases such as Tay-Sachs may seem an obvious first step in the application of CRISPR-Cas9, and may even be preferable for conservatives since the manipulation will be on sex cells rather than the embryo, but a prudential decision in a given community will also place such seeming advantages in a wider social and political context. Practical wisdom also helps to judge what the virtue of justice requires in given circumstances insofar as it is orientated towards the common good. It seems highly likely that the most vulnerable will be the target of any such campaign for trials in the lead-up to large-scale application in therapeutic treatments. Are all such treatments necessarily desirable as ends that promote overall human flourishing? The Thomistic tradition of practical wisdom provides the means, at least, to attempt to take account of a multiplicity of factors in decision making, including what such “balance” might look like in practice; for example, by giving moral priority to the weak, and not just to those suffering from various diseases.

Practical wisdom is not a panacea, but it may be an important alternative to the idea that all we need to do is apply fixed principles such as individual autonomy to ethical problems that are, at root, the same. A broad framework for decision making through a prudential lens acts as a guide that is less about absolute rules of right and wrong and more about taking appropriate responsibility for human flourishing as perceived according to specific virtues of the human community: practical wisdom, charity, compassion, and mercy.

 


[1] This blog draws on ideas that are further developed in the following: Celia Deane-Drummond, “The CRISPR Challenge and the Beatific Vision: Recovering Practical Wisdom as a Guide for Human Flourishing,” in Erik Parens and Josephine Johnston, eds., Human Flourishing in the Age of Gene Editing (Oxford University Press, 2019).

 

Celia Deane-Drummond
Celia Deane-Drummond is Director of the Laudato Si’ Research Institute and Senior Research Fellow at Campion Hall, University of Oxford. She is also Visiting Professor in Theology and Science at the Centre for Catholic Studies, University of Durham. She was previously Professor of Theology and Director of the Center for Theology, Science and Human Flourishing at the University of Notre Dame. She holds two doctorates, one in plant science and one in systematic theology. Her research is at the intersection of theology and theological ethics and the biosciences, including evolutionary anthropology, evolution, genetics, psychology and ecology.
Global Currents article

Catholic Conceptions of Personhood and Gene Editing

Human Genome Project – Smithsonian National Museum of Natural History. Photo Credit: Richard Ricciardi

We live in an age where we have become accustomed to the constant onslaught of technological interventions in our lives. Some of it is positive, enriching, and conducive to our flourishing; much of it is frightening. In the arena of medicine, we fear the possible consequences of these interventions—consequences to our physical bodies, but also consequences to our identities. Our identities as persons cannot be separated from our bodies; nor can we imagine that they are formed in isolation from other persons. Yet, with knowledge of genetics we confront the reality that much of “who” we are is shaped by the matter that makes up our physicality. For that reason, interventions directed at our genes elicit great alarm, especially when those interventions tamper with germline cells ensuring that the effects will be passed on to future generations.

Genetic interventions are not new. Since at least the 1980s, we have been able to manipulate genes directly and indirectly. This manipulation is mostly driven by medical science with its goal of treating diseases and reducing human suffering. Yet lurking in the shadows is the worry that this goal could easily be put to nefarious uses. The most recent development in this arena is what is referred to as CRISPR-Cas9 (clustered regularly interspaced short palindromic repeats), a technique of genome editing[1] that provides an easily accessible way to directly change the makeup of individual genes. CRISPR has been described by many as a radical development in genetic engineering, mainly because unlike earlier interventions it allows us to modify individual genes in a highly targeted and cost-effective manner. As with all earlier advances in genetic technologies, the hype distorts the reality. For example, an online news account describing how CRISPR works has the following subheading: “Everything you need to know about the genome editing breakthrough that one day could cure disease, eradicate species and build designer babies.” One can see in such a quote the way that important distinctions, such as between treatment and enhancement, are elided.

In this brief comment, I will bracket the possibility of nefarious uses of this technology and ask instead how even positive uses of this type of intervention (to cure disease, for example) might affect human identity. My question will be framed by Roman Catholic conceptions of identity and personhood. Insight into the Catholic view is especially important in light of the perception that the Catholic church rejects bio-technological interventions. Understanding how this particular faith community navigates a path between fear of technological overreach and the pursuit of medical advances allows us to see the complexity of the relationship between human identity and genetic intervention. There are three features of identity central to the Catholic view that can also illuminate public discussions about the future of genomic editing and its impact on human identity. These are: humans exist in a tension between dependence and responsibility, humans are relational, and humans are embodied. As I shall argue, properly understood, all three of these features can serve as the scaffolding for a society that values justice.

Human identity in the Catholic context is derived from the understanding that humans are created in the image of God, meaning that humans possess rationality and the capacity to know and pursue the good. Humans exist in a tension between being creatures of God and “free” agents who are responsible for their actions. Hence, they are both dependent and free. This status grants humans the ability to pursue their moral good while also leaving them the possibility to turn away from it. Human identity and personhood are thus premised on this capacity. The choice to modify individual genes is a consequence of that fundamental freedom, but it also challenges the relationship of human dependence on God. Critics of genetic interventions often invoke the metaphor of “playing God” as a way to signal human overreaching, arrogance, and pride. This tension between dependence and freedom/responsibility ought to function as a sort of horizon against which to understand moral choices, both individually and communally. As individuals, our actions are constrained by this tension; yet when expanded to society, the tension forces us to reflect more deeply on what it means to be responsible to our fellow humans. Applied to genomic editing, this suggests a prudent course which expands the range of consequences of our actions that count as relevant. Thus, for example, responsibility to persons on the margins of society, who are least likely to benefit from these interventions, must be a driving force in our moral reflections on this issue.

Depiction of the Trinity on the portal of the Basilica of St. Denis, France. Photo Credit: Rebecca Kennison, 1990.

Also fundamental to Catholic conceptions of personhood is the connected idea that the person is relational and communal. The moral injunctions to love one’s neighbor and to do justice presume that human identity is shaped by and for interactions with, and responsibility to, others. For Catholics, there is a theological backdrop to this notion of relationality—one grounded in the idea of God as three persons in the Trinity, suggesting that God, in his very essence, is relational. Yet, even without the theological backdrop, relationality implies human caring. It connects caring for the other with the inclination of all humans to self-preservation. Our bonds with other humans drive us to pursue medical technologies. One common concern about recent developments in genomic editing is that they threaten to undermine our bonds to future generations by altering future genomic maps. This concern extends the idea of relationality to a different temporal horizon.

In some sense the most fundamental Catholic belief about the nature of human identity is the emphasis on the embodied nature of human existence. Beyond a mere statement of fact, this claim is normative insofar as morality is experienced in and through the body. Yet, making the body central is not to suggest that the human is merely a whole made up of its component material parts. The body is endowed with meaning and it also creates meaning. While the Catholic tradition’s relationship to natural law is complicated, there is strong agreement that the body can provide information, and perhaps even guidance, for determining morally appropriate actions. Exactly how this happens is complicated by the fact that bodies are mediated culturally and the meanings we derive from them are shaped by broader webs of meaning. It is possible to see this third feature as existing in tension with the other two.

All three of these features of human identity support the centrality of respect for human dignity to Catholic ethics. Genome editing’s ability to manipulate an individual’s genetic identity can easily be seen as an assault on human dignity, especially if dignity is conceptualized as material integrity or wholeness. Yet, a different picture emerges when one expands integrity to mean well-being and human flourishing in a community governed by the norms of justice. Put differently, we must be vigilant to maintain the first two features of human identity alongside the third one.

These accounts of human identity and dignity tell us that tampering with the physical building blocks of the human person has far-reaching consequences that threaten to disrupt the essence of the person. Yet, do they provide us with sufficient evidence to support the view that genome editing ought to be morally prohibited? Earlier debates about the ethics of genetic technology focused on drawing lines between germ cells and somatic cells, and between therapy and enhancement.[2] Those lines were intended to protect against any perceived threats to human identity—threats that might embolden humans, leading them to forget the tension in their relationship to God, or to lose sight of their fundamental relationality, or to mistake their embodiedness as a purely material construct. The news in late 2018 that scientists had succeeded in editing the embryonic genes of twins born in China suggests that this technology will not disappear. The question for moralists is whether or not the moral arguments deployed in earlier conversations about genetic technology will prove sufficient to the task of addressing this latest twist. My view is that drawing this line between shorter-term somatic cell interventions and irreversible germ cell interventions is still a prudent course of action as well as a morally sound one. The three features of Catholic thinking about human identity can function as groundwork for the line-drawing task by reminding us of human limits, communal commitments (to the present as well as future communities), and the meaning of embodiment. While the ease and availability of CRISPR technology makes the line both harder to draw and harder to hold, it is important that we not lose sight of these fundamental views about identity.

 

[1] It is referred to as editing since it enables the scientist to target specific parts of the DNA sequence that have been identified using letters of the alphabet. The question of the implications of the editing metaphor is important to consider. The Nuffield Council report issued in 2016 addresses this important point and draws attention to the reductionism as well as the “overstretching” that can result from relying too heavily on a metaphor. Nuffield Council on Bioethics, “Genome Editing: An Ethical Review” (London: Nuffield Council on Bioethics, 2016).

[2] Kelly E. Ormond et al., “Human Germline Genome Editing,” American Journal of Human Genetics 101, no. 2 (August 3, 2017): 167–76, https://doi.org/10.1016/j.ajhg.2017.06.012.

Aline Kalbian
Aline Kalbian is Professor and Chair of Religion at Florida State University. She has published on a range of issues in bioethics, sexuality, gender, and moral theology. She is also co-editor of the Journal of Religious Ethics.
Global Currents article

Unmasking Neoliberalism’s Invisible Grip: Homo Economicus and the Person in Bioethics

Biohacker’s Handbook with Oura Ring Size Kit, DanteLabs DNA Analysis Kit, 23andMe Saliva Collection Kit and Recover CBD. Photo Credit: Marco Vercho, 2019.

Much has happened with gene-editing since Contending Modernities’ “Out of the Lab” podcast. Despite the National Academies of Sciences, Engineering, and Medicine’s 2018 recommendations that gene-editing should be stringently regulated and used only for a limited number of somatic diseases at this time, a surprisingly stunned world witnessed the birth of twin CRISPR-Cas9-edited girls in China in November, with a third baby on deck. Voices across the spectrum—scientific, ethical, theological, policy-oriented—excoriated the researcher, He Jiankui. Though repeatedly described as a “rogue scientist,” it now appears that He may have had at least one US collaborator.

Listening to the above commentary, a trained ear might hear a pattern, a subtle but regular pulse, that signals the heart of the matter. Where Adil Najam fears a “gap” between the ethical, policy, and “entrepreneurial realities” surrounding technologies like gene-editing, I would suggest that these are, rather, all neatly aligned.[1] To put it pointedly: the CRISPR conversation makes clear that bioethics, as it has emerged since the 1980s, is a deeply neoliberal project.

This is a big claim—one that can hardly be thoroughly argued in a blogpost. A complete argument would require detailing the intertwined histories of neoliberal economics and bioethics as they emerged post-World War II. Here I will only point to four notes that resonate throughout the literature. When taken together, they sound the dissonant chord of neoliberalism. These are: CRISPR as a technique, concerns about commercialization, dyspepsia about regulation, and the framework of bioethics itself, particularly its understanding of the person.

Neoliberal Bioethics

First, the briefest primer on neoliberalism. Bruce Rogers-Vaughn, in his important book Caring for Souls in the Neoliberal Age, defines neoliberalism as “the free market ideology based on individual liberty and limited government that connected human freedom to the actions of the rational, self-interested actor in the competitive marketplace” (2).[2] Arising in the early 20th century, neoliberalism emerged in full force in the late 1970s and early 1980s with the Reagan-Thatcher era and the Washington Consensus. Central tenets include the liberalization of trade barriers, privatization of social services, globalization, and deregulation. In order to limit government, neoliberalism calls for sharply reducing or eliminating social services and welfare programs. The “social” is perceived as a mythic restraint on individual freedom. Neoliberalism aims to maximize the freedom of the individual, homo economicus—a person whose fundamental activity is choice and who chooses the good as she or he defines it, based on a rational calculation of pure self-interest. Society is little more than an aggregate of autonomous individuals each pursuing their own good. Notably, however, freedom is redefined in market terms.

Vaccine-based cancer immunotherapy from novel nanoparticle systems. Photo Credit: NIH Image Gallery, 2016.

Neoliberalism is not simply an economic theory. It is a cultural project that subtly and pervasively organizes contemporary life. Rogers-Vaughn, in tracing how neoliberalism has transformed psychiatry, provides a template for making visible how it has likewise altered other areas of medicine and clinical research. CRISPR-Cas9 embodies a new approach to thinking about diseases, social problems, and human identity that he refers to as “methodological individualism.” Since roughly 1980, when mental illness was reconceptualized in the DSM-III, through “gene therapy,” stem cell therapies, the BRAIN initiative, neuroscience, and individualized or personalized medicine, a subtle shift has occurred that locates the source of diseases or problems within particular individuals rather than within social or political structures. Illness, here, is conceived as highly individualized, rooted deeply in the nano-loci of personal biology—genes or neural signatures. This new etiological framework drives a search for “biologically-mediated person-specific treatments.”[3] CRISPR envisages the human genome as a biological text that needs “editing.” Therein lies the problem. Having defined disease as biologically mediated, the medical-industrial complex then hunts for biological interventions that can efficiently fix mistakes located at the deepest level of our being—or, via enhancement, that shape our identities.

Though justified by the goal of reducing suffering, a second neoliberal commitment catalyzes the hunt: economic efficiency and maximizing profits. In the podcast, Maura Ryan raises concerns about “commercialization.” Aline Kalbian repeatedly refers to CRISPR’s “entrepreneurial aspect” and our free market, competitive context. He Jiankui’s motivation for creating the CRISPR babies was “personal fame and fortune.” Others in the Contending Modernities series raise concerns about commodification. But exorbitant prices, pervasive commodification, and a focus on market share and ROI are not accidental. They are the result of intentional neoliberal policies. The 1984 Drug Price Competition and Patent Term Restoration Act transformed the pharmaceutical market. In 1985, the FDA approved, for the first time, direct-to-consumer marketing for medical products. In 1989, NIH established the Office of Technology Transfer to maximize the financial profits of government-funded research. The list could go on. Moreover, via Gary Becker and the Chicago School, the market extends to an ever-wider array of social realities; the market becomes, in the catchphrase of Freakonomics, “the hidden side of everything.”

Kalbian notes in the podcast that commercial aspects of new medical technologies are not being regulated. David Baltimore, chair of the National Academies’ committee on gene-editing, laments the “failure of self-regulation in the scientific community” in the CRISPR babies case. But we should not be surprised. As Michael Fitzgerald more realistically states in “Out of the Lab”: “regulation gets in their way.” Deregulation, as mentioned earlier, is a central neoliberal platform. Regulations, characterized as the demon of big government, constrain the market’s freedom. Rogers-Vaughn notes a concerted movement, beginning in the late 1970s, to make “governments reduce or withdraw laws and rules requiring corporations to consider any purposes other than pursuit of profit.” In the mid-1990s, when I served on the Recombinant DNA Advisory Committee, Big Pharma was a visible presence at our quarterly meetings, exercising a watchful eye over ethicists or community members who might seek to put limits on R&D.

Almost point for point, current analyses of gene-editing reprise those 1990s debates. CRISPR-Cas9 is essentially gene therapy 2.0. New technologies are more efficient and likely more efficacious than adenovirus vectors. But the same ethical arguments were made in the 1990s as now; the same guidelines were put in place. The bioethical framework has not changed. From the National Academies to ethicists and analysts, the debate remains mapped by beneficence, non-maleficence, justice, and respect for persons, pastiched over a bedrock of utilitarianism. Or…is it respect for persons? As I have narrated elsewhere, 1980 is not only a key moment in the history of neoliberalism. It is also a key moment in bioethics. For in 1979, another subtle but important shift occurred: Belmont’s respect for persons morphed into Beauchamp and Childress’ respect for autonomy.

The “person” as a regulative concept in medical ethics emerged at a particular historical moment: post-War Europe, first gestured at in the Nuremberg Code in 1947.[4] (Is it a coincidence that the second phase of neoliberalism begins around 1950?) Imported to the US in the late 1960s after a series of research scandals, “personhood” becomes integrated into the emerging bioethics discourse with Paul Ramsey’s Patient as Person in 1970. Initially, “personhood” was protective—seeking to stem research abuses against vulnerable populations (children with mental illnesses, African-Americans), to counter medical paternalism, and to resist the “depersonalization” of modern medicine. From Nuremberg through Paul Ramsey to the Belmont Report, the term “person” was invoked to ensure that autonomous persons were given the right to informed consent—whether for research or medical care—and that non-autonomous persons (or “all who share human genetic heritage,” in the language of the National Commission’s 1975 Report and Recommendations: Research on the Fetus) were protected, even to the point of excluding them from research that could potentially benefit others.

From Personhood to Homo Economicus

But in 1979, almost before the ink is dry on the Belmont Report, respect for persons transmutes in Beauchamp and Childress’ first edition of Principles of Biomedical Ethics into respect for autonomy. Henceforth, talk of persons becomes largely “permissive”—we now have to determine who counts as a person before we can determine what, if any, responsibilities we owe them. Knowing who counts as a person helps resolve dilemmas around abortion, end of life, organ transplantation, stem cell research, etc. Most interestingly, “persons” for bioethics come to be defined as autonomous subjects who express their agency through the rational act of choosing whichever ends further “their own good,” maximizing their own self-interest. Social determinants of health, social location, social structures, even family members rarely enter this calculus. The “person” of bioethics post-Beauchamp and Childress, post-1980, is homo economicus.

In the gene-editing podcast, Aline Kalbian asked, “what is it, exactly, that ethicists bring to the table?” While the dignity or sanctity of persons is often held up as a hedge against the endless encroachment of market forces in medicine, the attitude Pope Francis so aptly names “the throw-away culture,” it may well be that the principles of bioethics subtly serve not as a corrective but rather as a tool of the market.[5] Lisa Cahill depicts science, economics, theology, and liberal democratic political discourse as “thick worldviews” that compete in our engagement around bioethics and health policy. But it is not a level playing field. History suggests that the thick worldview of the neoliberal paradigm underlies them all. It shapes bioethics, medicine, scientific research, and medical technologies. This is why it is often hard to see what bioethics brings.

For-profit blood center in Louisville, Kentucky, USA. Photo Credit: EX22218 – ON/OFF, 2016.

Clarifying the neoliberal structure of bioethics and emerging medical technologies not only helps us understand the contours of the CRISPR landscape. It illuminates other disquieting dynamics. For example, certain technologies, once approved, become cast as morally normative. If one could eliminate a defective gene from one’s children using CRISPR, is one not morally obliged to do so? Belying the rhetoric of individual liberty, as neoliberalism evolves in the late 20th century, homo economicus becomes subservient to that sovereign master: the economic dogma of rational, utility-maximizing self-interest. In a troubling inversion, what must be free now is not persons but the market.

Or why is it so difficult to advance the notion of the common good?  Perhaps the answer lies in one of the first steps in the creation of modern capitalism, that original act of privatization, the literal enclosure of the commons in England from the 16th century forward. Step-by-step, material “commons”—even our genomes—are no longer shared. They are patented, commodified (23andMe!), and used as raw materials to create new products for profit and consumption.

If this is the case—if biotechnologies and bioethics and bioethics’ concept of the person are intrinsically shaped by neoliberalism—where are we left with a technology like CRISPR? Such an angle doesn’t yield a simple thumbs-up, thumbs-down, or “we must stringently regulate this new and powerful technology.” Perhaps He Jiankui is not so “rogue” after all. Rather, perhaps the CRISPR babies provide a road-to-Damascus jolt to make us analyze not only a particular technological innovation but the way the infrastructure of bioethics may have enabled it. Let me point to three avenues forward.

Towards a Systems Analysis

First, it is time to begin to make these economic dynamics of biotechnology and bioethical issues visible. The Catholic social tradition is one of the main voices that has begun to do so. Beginning with the liberation theologians in the 1970s, through John Paul II, who named the structures of sin of money, power, and idolatry, especially in relation to globalizing technologies, to Pope Francis’ Laudato Si’ (following Benedict XVI’s Caritas in Veritate), Catholic social thought critiques the practices and effects of neoliberalism—particularly commodification, consumerism, and the exacerbation of economic inequality.

This lens needs to be brought to bear on bioethics. Few Catholic bioethicists have yet done so. These two “doctrinal” areas have too long been siloed.[6] A social lens asks about the historical and social contexts of concepts. Why did a particular concept arise when it did? Whose interests did it serve? It uses not only the tools of theology and philosophy, but also carefully attends to history and the social sciences. It presses for analyses that are, in the words of Paul Farmer, “historically deep and geographically broad.” One central tool of this “social-analytic mediation” (as liberation theologians call it) is economics, particularly political economy. My colleague Michael McCarthy and I have begun to address this gap in our recent book Catholic Bioethics and Social Justice: The Praxis of US Healthcare in a Globalized World. Catholic social thought here joins an emerging cadre of secular thinkers.[7] But much more work needs to be done.

Second, we need to move away from “single-issue” analyses that have long shaped bioethics (“Is CRISPR ethical or not?”) to broader systemic analyses. What are the connections between the CRISPR babies in China, the new career path of the “professional guinea pig”[8] in the US, the skyrocketing numbers of human research subjects globally,[9] and the serious toll that neoliberal economics has taken on health outcomes around the world by decimating social programs and local economies, just to name a few? (Rogers-Vaughn, for example, sees neoliberalism as causally responsible for an increase in mental health issues). The list could go on.

These issues are all of a piece, pointing to ways in which human bodies become the raw material for profit-making (or cost-savings), a reality woven into the fabric of bioethics and biotech itself. Coming to see this requires, as Pope Francis notes in Laudato Si’, not only hard intellectual work but also moral and spiritual conversion. Can bioethics be converted? Religious traditions—with their vision of thickly connected persons who develop and flourish integrally in communities—could well provide the lever to begin to shape a bioethics that privileges persons over profits. This would move away from a bioethics dominated by the methodological individualism of autonomy and enamored of the methodological individualism of technologies. It would provide a starting point for a radical conversion of our hyper-individualistic and extractive economic philosophy that inflicts austerity on the poor while licensing the almost unbridled creation of biotech products for consumption by the wealthy few.

But it is not only bioethics that needs to be converted. Conversion calls us to a new way of living. Might we declaim against the neoliberal splinter in the eye of He Jiankui while remaining happily blinded by the log of contemporary economics in every other aspect of our own lives? The lens we turn on him, we must also turn on ourselves. As this conversation among Contending Modernities unfolds, it seems an opportune time to reflect on how not only religious convictions (i.e., about persons) but also embodied religious practices, such as silence, simplicity, fasting, almsgiving, prayer, and the Eucharist, offer the potential for unshackling us from the subtle but pervasive ways that neoliberalism shapes our lives. Perhaps here is the starting point for beginning to see the underlying engine driving ourselves, our culture, our bioethics, and our biotechnology, and thereby to begin to unhand these interventions, and our very selves, from neoliberalism’s invisible grip.

 


[1] Contending Modernities, “Science and the Human Person Podcasts.”

[2] This definition draws on Daniel Stedman Jones, Masters of the Universe (Princeton: Princeton University Press, 2012).

[3] Bruce Rogers-Vaughn, “Blessed Are Those Who Mourn: Depression as Political Resistance,” Pastoral Psychology 63, no. 4 (August 1, 2014): 503–22, https://doi.org/10.1007/s11089-013-0576-y.

[4] Joseph J. Kotva and M. Therese Lysaught, On Moral Medicine (Grand Rapids, MI: Eerdmans, 2012).

[5] Charles Camosy, Resisting Throwaway Culture (New City Press, 2019).

[6] Maura Ryan, “Bridging Bioethics and Social Ethics,” Contending Modernities, September 27, 2013.

[7] Global Pharmaceuticals: Ethics, Markets, Practices, eds. Adriana Petryna et al. (Durham, NC: Duke University Press, 2006).

[8] Carl Elliott, White Coat, Black Hat: Adventures on the Dark Side of Medicine (Beacon Press, 2011).

[9] M. Therese Lysaught, “Docile Bodies: Transnational Research Ethics as Biopolitics,” The Journal of Medicine and Philosophy 34, no. 4 (2009).

 

M. Therese Lysaught
M. Therese Lysaught, PhD, is Professor at the Neiswanger Institute for Bioethics and Health Care Leadership at Loyola University Chicago, Stritch School of Medicine. Her books include Catholic Bioethics and Social Justice (2019), On Moral Medicine: Theological Perspectives on Medical Ethics, 3rd edition (2012), and the forthcoming Chasing After Virtue: Neuroscience, Economics, and the Biopolitics of Morality, co-authored with Jeffrey P. Bishop and Andrew A. Michel, to be published by the University of Notre Dame Press. For more on her work, visit: https://mthereselysaught.com/.
Global Currents article

Turkey, White Supremacy, and the Clash of Civilizations

Civilization board game. Photo Credit: Syvanen

“Teddy, how can one obtain a fair maiden?” a white supremacist playfully asks a toy teddy bear in a YouTube video. He then reports the teddy bear’s answer to his viewers: “in order to truly understand the nature of women, one must first retake Constantinople from the Ottoman Empire.” The connection between the sexism expressed in “obtaining” a “fair maiden” and the vanquishing of a “civilizational” enemy is a recurrent theme in the contemporary white supremacy that flourishes online. This quest for civilizational/racial purity combines an interest in white women’s reproductive and sexual availability with concerns about demographic “replacement.” Consider one of Republican Rep. Steve King’s many controversial tweets: “We can’t restore our civilization with somebody else’s babies.” Consider the anger the Tree of Life synagogue terrorist felt towards the Jewish congregation, not simply because they were Jews, but also because they were progressive Jews, helping refugees to safety in the United States. Consider the title of the Christchurch terrorist’s manifesto: “The Great Replacement.” In the contemporary United States, such statements, openly identified as extremist, operate alongside state-sanctioned racial policies with similar demographic targets, including attacks on women’s reproductive rights, endless wars on terror, the racialized criminal justice system, and immigration policing, all of which are taken to be politics as usual.[1]

Building on the presentations and discussions that took place during the April 16 flash panel organized by Atalia Omer, “Interrogating the Christchurch Shootings,” this blog post connects the gendered/sexualized nature of white supremacist theorizing to its mobilization of civilizational discourses. I argue that at this intersection of civilization talk and obsession with reproductive purity, one finds a toxic passion for an imagined medieval past and an obsession with the Ottoman Empire/Turkey that echoes the “Clash of Civilizations” rhetoric promoted by earlier civilizational theorists such as political scientist Samuel Huntington. This conjunction raises important questions about how respectable forms of scholarship and white supremacist ideologies may bolster each other. In fact, this discursive continuum parallels the continuum operating between racist state policies and “lone-wolf” racist violence considered beyond the pale of the law, and requires inquiry alongside it.

 

The Will to Clash of Civilizations

In one of the (generally positive) reader responses to my book manuscript, an anonymous reader noted I had made rather too much of an outdated concept that had already been criticized to bits:

“I do think discussing Huntington is fine (though it is a bit well-worn), though I was surprised when I came across phrases such as ‘the current popularity of Huntington’s thesis,’ which felt a bit dated.”

The reader was referencing my conclusion, where I emphasized how Turkey’s history of transculturation and prevailing debates about “westernization” in the country defied Huntington’s “Clash of Civilizations” thesis, which claimed that the world consisted of “seven or eight major civilizations” whose culture-driven conflicts could be traced throughout history and would determine post-Cold War era politics. Huntington’s broad “cultural” categories mimicked prevailing ideas of race and, where they differed, were reducible to religion. In revising for publication, I changed the conclusion to acknowledge that academia had both thrown the book at, and closed the book on, Huntington. I noted that almost immediately after his thesis appeared, experts rallied to demonstrate how Huntington had underplayed “intracivilizational” conflicts as well as cultural hybridity, and had done so at a time considered the high age of globalization. Countering Huntington’s thesis with data was indeed low-hanging fruit for anyone who had studied any “civilization” in any depth. Moreover, scholars such as Mahmood Mamdani had come to question whether “culture” could ever be an explanatory factor for political conflicts. In revising, I acknowledged the strength of these critiques, but hinted that the scholarly reports of the death of “Clash of Civilizations” may have been premature:

“The sharp divide between the West and the rest distorts reality, but rhetorical attempts to locate such a divide are real enough. Culture Talk influences politics.”[2]

The afterlives of Clash of Civilizations have indeed been robust, with extremism, the security state, the military industrial complex, and knowledge-production boosting each other at a dizzying rate. Polemics around the civilizational status of Turkey vis-à-vis Europe have transitioned well into the new century, flourishing in peer-reviewed scholarship as well as in white-supremacist YouTube comments.

 

Istanbul, not Constantinople?

Turkey appears as a recurrent headache in Huntington’s work. In his 1993 Foreign Affairs article, Huntington called Turkey of the 1990s—a laicist, Muslim-majority NATO ally and a candidate for European Union membership—“the most obvious and prototypical torn country,” which could not decide whether it belonged within “Islamic” or “Western” civilization. Despite claims to objectivity and descriptiveness, his 1996 book turned prescriptive when it suggested that Turkish leaders might soon be ready to stop their “frustrating and humiliating” attempts to join “Western civilization.” Instead of acting like “beggars,” he predicted, the country would do well to “resume its much more impressive and elevated historical role as the principal Islamic interlocutor and antagonist of the West” (178). This piece of unsolicited advice did make Huntington some strong “civilizational” enemies in Turkey, although perhaps not of the kind that he had imagined. Instead, Huntington’s words got incorporated into Kemalist and leftist conspiracy theories, which claimed that the United States was overseeing a plot to “Islamicize” Turkey through the leadership of the “moderate” Islamist Recep Tayyip Erdoğan and his Justice and Development Party (AKP) in order to weaken the country.[3]

And voila, Turkey was Islamicized and became more antagonistic towards the West, right? Although the current AKP regime might have pleased Huntington in many ways with its “neo-Ottomanist” policies, it does not fulfill his thesis either. After all, it was under AKP’s early rule that major human rights packages were passed in order to make Turkish law comply with the Copenhagen criteria for membership in the European Union. Similarly, the regime takes as many pages out of the Western playbook—referencing the U.S. presidential system to justify the latest turn to an executive presidency, for example—as it does from age-old Ottoman legends. Moreover, the vision of the Ottoman Empire as somehow the antithesis of Western civilization itself would be contested by any serious historian who has passing familiarity with the cultural exchange and military alliances forged between European countries and the final Muslim Caliphate.

Yet, in Huntington’s suggestion that Turkey give up its “westernization” policies lies a reality more powerful than all our footnotes and primary documents can suppress: over the course of the twentieth century, “Western civilization” has become both a substitute and an alibi for “whiteness” and “Christianity” and Turkey constitutes a problem for all of these categories.

Hagia Sophia at Night. Photo Credit: Simon Q

“UNTIL THE HAGIA SOPHIA IS FREE OF THE MINARETS, THE MEN OF EUROPE ARE MEN IN NAME ONLY,” wrote the Christchurch terrorist in all caps in his manifesto, referencing a Byzantine cathedral that had been converted to a mosque under the Ottoman Empire and is now a museum. The phallus, the cross, and the sword: a mythic civilizational formula based on an intersection of racism, sexism, and Islamophobia that, like the vision of a pure white Europe of the past, exists more in the minds of white supremacists than in historical reality.

In fact, the terrorist’s manifesto contains an entire section titled “To Turks.” Here the terrorist orders all Turks currently living in Istanbul (my city of birth and where my close relatives live) to retreat to the Asian side of the city, or face violence: “FLEE TO YOUR OWN LANDS, WHILE YOU STILL HAVE THE CHANCE.” He calls Erdoğan “the leader of one of the oldest enemies of our people, and the leader of the largest Islamic group within Europe,” in language recalling Huntington’s insult/praise for the Ottoman Empire as “the principal Islamic interlocutor and antagonist of the West.” References to Turks (and to massacred Bosniaks, whom he considered Turks) also decorated his assault weapon.

Why would an Australian terrorist living in New Zealand, a world away, be so obsessed with enforcing the boundaries between “Europe” and the Republic of Turkey? Because in our contemporary era, “Western civilization” operates as a code for whiteness and Christianity articulated in geographic and world-historic terms. As Alastair Bonnett notes in The Idea of the West, the concept of Western civilization has particular purchase in white supremacist theories about racial replacement because, in pointing to great achievements, it offers a fig leaf of loftiness to the insipid category of whiteness.[4] Thus, a simplified history of Ottoman clashes with nations mythologized as European ex post facto, with symbolic focus on Vienna and Istanbul, has gained new charge. In this construction, Turkey’s ambivalent status vis-à-vis the geographic subcontinent of Europe appears as an irritant.

In contemporary white supremacist theorizing, what does not belong in “Europe” must be excised at the altar of this construction that was never pure but must somehow be made pure. What Huntington once called “the bloody borders of Islam” turn out to be non-existent borders to be drawn in Muslims’ blood.

This is not to defend the now-excised phrase about “the current popularity of Huntington’s thesis” in that earlier draft of my book. The reader was right at the time. However, the so-called “Trump era” has clarified more than ever just how popular (in all senses of the term) calls to man the barricades of Western civilization have become in the present. In the age of social media, Huntington’s argument, itself a rehashing of older civilization talk, has found its viral legs and assault weapons.

Why did the solid scholarship that closed the book on Huntington’s thesis not work for so many? Is it because globalization turned out not to be the panacea it was promised to be? Or because our white supremacist political and socioeconomic structures necessitate the continuous redesigning and dissemination of white supremacist discourse? If historical and geographic reality are not enough to crush the Clash of Civilizations thesis, what is?

New Zealand has banned the circulation of the Christchurch terrorist’s manifesto, and Erdoğan has rightly been criticized for showing video footage of the massacre at campaign events to whip his supporters into a nationalist fervor. Yet censorship is not a safe option either. These texts travel, and so do their proponents. The Christchurch terrorist himself traveled around Europe and visited Istanbul several times. Whereas I see a complex racial and civilizational merging that can never be reversed in my city of birth, he saw a mistake to be turned into a battle cry. How do we deal with the convergence between Huntington’s subtle advice to Turkey and this terrorist’s threat “to Turks”?

 

Make Istanbul White Again

Figure 1. “Wait for us Istanbul.” Viral pro-CHP meme from early April 2019, comparing the appearance of the two mayoral candidates and their wives.

I end with a couple of memes and on a note of irony, as befits our terrifying hyperreal era: the racial/civilizational language that animated the Christchurch terrorist flourishes among Turks as well. In the lead-up to the recent municipal elections that led to AKP losing its hold over Istanbul for the first time in the party’s history, supporters of the opposition party CHP began circulating memes that praised the new mayor of Istanbul, Ekrem Imamoğlu, and his family for the way they look.

One meme contrasted a daytime image of Imamoğlu and his slim, youthful, blonde wife with a nighttime photo of the stout, mustachioed AKP candidate and his headscarf-wearing wife. “The candidates for Istanbul mayor and their partners,” it read, “On the one side, light; on the other side, darkness” (Figure 1). The light/dark metaphors did multiple duty here much in the same way they do in Western politics, referring to the timing of the two photos, mobilizing connections to modernity and backwardness, and dogwhistling colorism.

Figure 2. Viral Turkish meme from early April 2019, celebrating the new mayor of Istanbul and his family for looking like Northern Europeans.

A post-election meme celebrating Imamoğlu’s victory depicted his fair-skinned and light-haired family wearing “modern” clothing and exclaimed, “Bro, look at the family! In one instance, we progressed 100 years. We became like Finland, Sweden and Norway!” (Figure 2).

Expressing a half-earnest yearning to finally become “like Finland, Sweden, and Norway,” such texts of digital folklore demonstrate that Turks themselves will not be left behind when it comes to pushing back against Turkey’s well-earned racial and civilizational ambiguity.[5] However, while Western white supremacists insist Turks can never belong within whiteness and must be pushed out of Europe, Turkish white supremacists see uncontested whiteness as within grasp, pending the election of political representatives with the right/white look. In both cases, women’s bodies are made to bear the burden of racial/civilizational proof.


[1] Scholarship investigating this continuum with regards to the War on Terror includes Leti Volpp, “The Citizen and the Terrorist,” UCLA Law Review 49, no. 5 (2002): 1575-600; Michael Welch, Scapegoats of September 11th: Hate Crimes and State Crimes in the War on Terror (New Brunswick, N.J.: Rutgers University Press, 2006); and Inderpal Grewal, Saving the Security State: Exceptional Citizens in Twenty-first-century America (Durham, N.C.: Duke University Press, 2017).

[2] Gürel, The Limits of Westernization, 188. See Chiara Bottici and Benoît Challand, The Myth of the Clash of Civilizations (Florence, Ky.: Routledge, 2013).

[3] Gürel, The Limits of Westernization, 136. See also Emre Kongar, ABD’nin Siyasal Islam’la Dansı (Istanbul: Remzi, 2012), 28-34.

[4] Alastair Bonnett, The Idea of the West: Culture, Politics and History (New York: Palgrave Macmillan, 2004), 26-8.

[5] For an excellent historical discussion of the Turkish will to whiteness, see “Is the Turk a White Man?”: Race and Modernity in the Making of Turkish Identity by Murat Ergin (Brill, 2016).

Perin Gürel
Perin E. Gürel is associate professor of American Studies and concurrent assistant professor of Gender Studies at the University of Notre Dame. Her first book, The Limits of Westernization: A Cultural History of America in Turkey (Columbia University Press, 2017), explores how Turkish debates over “westernization” have intersected with U.S.-Turkish relations in the twentieth century. Her work has appeared in American Quarterly, American Literary History, the Journal of the Ottoman and Turkish Studies Association, the Journal of Transnational American Studies, the Journal of Turkish Literature, and elsewhere. She is currently working on a second book project on the impact of U.S. political discourses on Turkey-Iran relations from the Cold War to the War on Terror. Gürel is also faculty fellow for the Kroc Institute for International Peace Studies and the Nanovic Institute for European Studies at Notre Dame.
Global Currents article

De/Provincializing Europe

One of the most curious aspects of the Christchurch mosque shootings is that Brenton Tarrant, the twenty-nine-year-old accused of murdering fifty-one worshippers, was obsessed with the medieval Crusades, as well as early modern clashes between Christian Europe and the Ottoman Empire. Yet Tarrant’s infatuation with these histories, coupled with his unstinting allegiance to far-right extremism, is by no stretch unique. Indeed, Tarrant’s embrace of “ethno-nationalism” echoes the re-emergence in the 1910s of the U.S.-based Ku Klux Klan, which modeled its uniform on the confraternity robes worn by members of Christian religious orders hundreds of years earlier. White supremacists reasserted this dubious connection to the Catholic Church in the 1960s when the murderous White Knights of the Ku Klux Klan named its Mississippi chapter after religious military orders such as the Knights Templar and the Knights Hospitallers.

Pieter van Laer, The Flagellants (c. 1635), Munich, Alte Pinakothek

Despite Tarrant’s seeming awareness of these and other futile attempts to rationalize racial violence via Christianity, his overall comprehension of the particulars and nuances of European social history is utterly lacking. A telling excerpt from Tarrant’s seventy-four-page manifesto, posted online prior to the Christchurch massacre, suggests as much: “The origins of my language is European, my culture is European, my political beliefs are European, my philosophical beliefs are European, my identity is European and, most importantly, my blood is European.” As these distorted ascriptions illustrate, Tarrant’s identification with the continent posits that the essence of European heritage is effectively synonymous with his own mythic “whiteness.”

This worldview, and its intersections with a global uptick in racial terrorism, underscores how deadly the resurgence of white supremacy and separatism in recent years has become. And in the case of Europe, various manifestations of these patterns suggest the risk of failing to acknowledge the continent’s history of social, cultural, religious, and ethnic diversity.

Despite marked progress since the 1960s in dismantling colonial frameworks as the predominant mode for interpreting European history—including Europe in the world—“clashes” rather than pluralism remains a standard trope across popular discourse. Referring to the “early success of multiculturalism in Britain,” and that nation’s attempt to “integrate, not separate” in a 2006 essay published by the Financial Times, Harvard University economist Amartya Sen lamented the abandonment of what he summarized as greater Europe’s “championing of every form of cultural inheritance.” Sen also warned: “The current focus on separatism is not a contribution to multicultural freedoms, but just the opposite.”

Fifteen years later, on the eve of Britain’s disorganized separation from the E.U., transmitting Europe’s complex, deeply interconnected multiracial heritage has never been more pressing. Because a small but growing cohort of scholars are already doing this work, there are several groundbreaking, positive, and multi-faceted studies from which to draw in pushing back against the spread of xenophobia. Moreover, despite a vague undercurrent of cynicism in the academy—e.g. political scientist John T. Scott’s review of historian Catherine Fletcher’s masterful study, The Black Prince of Florence: The Spectacular Life and Treacherous World of Alessandro de’ Medici—recovering and amplifying non-white voices, experiences, and representations in European Studies remains popular both in and well beyond the ivory tower.

Although Fletcher’s study—of the first black head of state in the modern Western World—does not make this connection, Alessandro de’ Medici’s African mother, Simonetta da Collevecchio, was likely named after Simonetta Vespucci (1453–1476), the Florentine model who was famously the muse of Giuliano de’ Medici. By the end of the fifteenth century, as the flourishing of black artistic representations of St. Maurice (like the two handsome images below) and an array of other cultural ephemera indicate, “whiteness” was far from the only standard of beauty in Europe on the eve of the so-called “High Renaissance” (1500–1530).

St. Mauritius (Inside) and St. Sebastian (Outside). From the former Marienaltar from St Peter’s Abbey in Salzburg (c. 1498), Vienna, Österreichische Galerie Belvedere
St. Maurice. Statue from a niche in the central panel, Altarpiece of the High Altar (c. late XV – early XVI century), Halle (Saale), Church of St. Maurice

While the Renaissance was indeed more racially pluralist and sophisticated than is commonly assumed, the cosmopolitan habits and perspectives associated with this period were not new. In fact, both the medieval and early modern eras are especially powerful periods through which to understand Europe’s heterogeneity—not least due to the continent’s burgeoning fascination with African wealth, characteristics, and civilization from the late thirteenth century onward. Centuries before the ascendance of scientifically based racism that followed the onset of the Atlantic slave trade, blacks were welcomed, if not cherished, at royal courts throughout Europe. And as the story of Alessandro de’ Medici’s marriage to Margaret of Austria (1522–1586), the daughter of Holy Roman Emperor Charles V, affirms, “blackness” was clearly compatible with divine rule and princely virtue across greater Europe.

As Ivana Čapeta Rakić, Yona Pinson, and other scholars have pointed out, the Roman Catholic Church encouraged the positive incorporation of Africans, and especially those who were Muslim—arguably the best evidence researchers have of uneven commitments to this ideal is iconography from the late medieval and early modern periods. Not only did a wide range of these artifacts mythologize the epic reach of the Catholic order, but their promotions of a religious imaginary often stretched across three continents (e.g. the Biblical Magi, or Three Wise Men or Kings, featured in the image below). However imperfect these gestures were, they symbolized the potential power of a global, egalitarian Catholic future, and a broader European interest in commemorating people of African descent that deserves more attention.

Master of the Gereon Altar, Mary’s Altar with Saints from St. Gereon at Cologne, detail (ca. 1420-1430), Berlin, Gemäldegalerie
Such initiatives were not extrinsic to the European Renaissance that Swiss art historian Jacob Burckhardt famously rediscovered in the nineteenth century. As Holy Roman Emperor Charles V and Alessandro de’ Medici’s uncle, Pope Clement VII, were certainly aware, evidence of steadfast, interethnic bonds stretched back to Greco-Roman civilization. Yet twenty-first-century misconceptions that these ancient civilizations were both “white” and “European” are even stronger than those about the early modern period. Modern societies need more scholars, educators, and popular commentators challenging this outmoded paradigm. When the Roman Empire comes up in my undergraduate courses, the overwhelming majority of my students have never been taught that the post-Republic Roman Empire stretched well into Africa. (Here too it is worth remembering that Alessandro de’ Medici was the second head of state in Italy with African heritage—Roman Emperor Lucius Septimius Severus [145–211 AD] was the first.) I also make it a point to explain that, one thousand years later, the Ottomans saw themselves as the inheritors of the Roman emperors, and perceived the Hapsburg Empire as infringing on this claim, especially after the seizure of Constantinople in 1453. Moreover, I qualify that in border communities, cultural practices between the Ottoman and Christian empires were rarely as divergent as dramatized histories of their geopolitical skirmishes in films and other forms of popular culture suggest.

Given the tendency of scholars and commentators alike to emphasize the enslavement, oppression, and defeat of people of color by—“white”—Europeans, it is easy to see how Tarrant fell for the overwrought cliché of the totemic clash between the Ottoman Empire and Europe. We can safely assume that Tarrant remains completely unaware that the Dutch Republic—led by the highly commercialized County of Zeeland, after which New Zealand is named—began trading directly with Asia in the seventeenth century to ensure its own religious freedom.

In addition to the internally displaced Dutch populations who escaped religious persecution by moving from the southern to the northern Netherlands at the end of the sixteenth century, Jewish communities expelled from the Iberian Peninsula were also beneficiaries of this haven. And the sheer abundance of the nation’s striking late sixteenth- and early seventeenth-century paintings of black Africans and other populations of color suggests that appreciating cultural pluralism was the standard rather than the exception in the early Dutch Republic.

Jan van Kessel the Elder, The Continent of Africa, detail (c. 1664-66), Munich, Alte Pinakothek

Recalling these progressive intentions is not an apologia for the unspeakable destruction Europeans caused in the early modern period. While Portugal’s seizure of major centers of the Asian spice trade between 1507 and 1515—most notably in the Malaysian city of Malacca, the Iranian island of Hormuz, and the Indian city of “Old Goa”—was devastating, Dutch rule was even more brutal. And the commercial trade in black bodies beginning in the fifteenth century is a chilling reminder that visual culture alone could never challenge large-scale human degradation.

The inspiring and the downright horrific legacies of the first era of globalization should not be presented in zero-sum terms—especially given how interconnected civics, economics, politics, religion, and culture were in this period. Above all, the brief reflections offered here, which suggest a roadmap for “deprovincializing” early modern social and cultural history, are driven by my own commitment not to echo a standard trope: that the history of Europe is synonymous with a history of “whiteness.” Not only has this tenacious yet incorrect understanding skewed the public’s intellectual development, but it also continues to support exclusionary social norms that neglect the histories, presence, representations, and experiences of non-white persons altogether.


Suggested Further Reading:

T. F. Earle and K. J. P. Lowe, eds., Black Africans in Renaissance Europe, reissue edition (Cambridge University Press, 2010).

Catherine Fletcher, The Black Prince of Florence: The Spectacular Life and Treacherous World of Alessandro de’ Medici (Oxford University Press, 2016).

David Northrup, Africa’s Discovery of Europe (Oxford University Press, 2013).

Ben Vinson III, Joaneath Spicer, Natalie Zemon Davis, and Kate Lowe, eds., Revealing the African Presence in Renaissance Europe (Walters Art Museum, 2012).

Korey Garibaldi
Korey Garibaldi is Assistant Professor in the Department of American Studies at the University of Notre Dame. He is currently working on a book project, tentatively entitled Before Black Power: The Rise and Fall of Interracial Literary Culture, 1908–1968. The study traces six decades of fitful yet dynamic cross-racial literary collaborations, with a particular focus on projects authored and produced by black and white Americans. Most recently, Korey has co-authored an op-ed on access to Level-1 trauma care published by The Chicago Tribune, and an article reconnecting Henry James and Gertrude Stein in the Fall 2018 issue of The Henry James Review. His book chapter, “Sketching in an Age of Anxiety: Henry James’s Morganatic Baroness in The Europeans,” in Reading Henry James in the 21st Century: Heritage and Transmission (Cambridge, 2019), will appear in July.
Global Currents article

Islamophobia Beyond Christchurch: Muslims as Tamed Others

Ruined building converted into a modern market (Bali). Photo Credit: Siegrid Saldaña

As Islamophobic tragedies grow more frequent and “normalized” in our day-to-day lives, it has become easy for us to set aside “smaller” Islamophobic incidents while our attention is diverted to large-scale tragedies such as Christchurch. The New Zealand attack was easy for the media to digest, for it featured a white nationalist shooter who killed many Muslims while deliberately presenting a clear Islamophobic and racial motive. However, we also know that structural violence like Islamophobia does not surface only when a white racist opens fire in two packed mosques in the midst of Friday Prayers. Rather, Islamophobia is intertwined with state-sponsored violence and securitization, narratives of nationalism, and the flows of the global market and commodification. Drawing from my reflections at an April 16 panel on the Christchurch shooting, this essay will discuss how state securitization and market commodification perpetuate Islamophobia.

Today’s Islamophobia is more than just a specific form of racism. The scholar-activist Arun Kundnani and scholar of Islamophobia Tania Saeed argue that we should expand our understanding of racialization within hegemonic Islamophobic societies. Kundnani is concerned with the “racialization of socio-political, religious, and cultural contexts” that expands the exclusionary boundaries of Islamophobia. In this case, the definitions of “Islam” and “Muslim” are being redrawn based on what is to be protected, no longer on what is to be excluded. This means that we can no longer attend to one generic Islamophobic exclusionary line around “Muslim minorities in a White-Christian majority country,” but rather must attend to multiple specific exclusionary lines that are drawn based on different threat perceptions in Islamophobic societies. It also means that “Islam” and “Muslim” carry multiple meanings and signifiers (skin colors, attire, languages, customs and traditions, etc.) that are perceived as objective threats against said Islamophobic societies.

Looking into those contextualized boundary-drawings, Tania Saeed uses the term “degrees of alterity” to show the different limits of accommodation—or should we say “tolerance”?—that an Islamophobic society will express when dealing with those who deviate from the dominant normative order. In this context, the benchmark of acceptance is the White Male Universal Self, against whom the Others must be judged based on their “likeability” or “acceptability.”

Seeing “alterity” on a spectrum, rather than in absolute binaries, allows us to probe the grey areas where Islamophobia is not readily apparent due to its intermingling with other forms of structural violence, such as those sponsored by both modern nation-states and the global market. This is the case because the spectrum of alterity takes into account the superficiality of a “tolerance” and “diversity” that rarely genuinely embrace the presence of the Others, but rather commodify it within the context of the Global Cultural Bazaar. This bazaar is understood as the global marketplace where global images and global dreams are disseminated and consumed through films, television, music, fashion merchandise, and other products. Furthermore, the spectrum of alterity is also useful in understanding a form of Islamophobia that comes into being within the framework of state-sponsored violence and securitization, which I will elaborate on later in the essay.

The Global Cultural Bazaar can also be seen as a globalized context in which articles and expressions of the “cultures” and “religions” of the marginalized Others are sold and consumed by the privileged Self (Mohanty, 2017). It is a perfect, abstracted context in which we can see how the “tamed alterities” of Muslims are commodified as products to be consumed. From Rumi’s poetry to designer-labelled sarongs, “Muslimness” can be tolerated (or even embraced) insofar as it is presented in neat consumer goods that serve to remind consumers of their positions as the Subject, that is, as owners of capital who are allowed to objectify those goods. In other words, reproducing the violent binary of “good Muslims, bad Muslims,” the Global Cultural Bazaar uses Islamophobia as a filter to distinguish tolerable alterities from intolerable ones.

In my field research among Indonesian female migrant workers (FMWs) in Singapore, I found that the expansion of Islamophobic commodification into the field of unskilled labor has turned even Muslim FMWs’ religious praxis into yet another factor that reduces their “marketability” in the global market. Women I spoke with shared how Muslim FMWs are prohibited by their agencies or employers from wearing their hijabs inside the employer’s house (their workplace as caregivers) in order to present a “friendlier” face in their non-Muslim workspace. Other workers are tricked and/or forced into eating pork in order to accommodate employers who find their unwillingness to consume a specific kind of meat somehow threatening. As the process of commodification can turn practically anything into a product to consume, these FMWs are obliged, by their positions as laborers in the global market, to increase their “market value” by taming some of their Muslim alterities.

“Say No to Burkas” mural in Newtown, Australia. Photo Credit: Newtown Grafitti

The oxymoronic tension between the voyeuristic preservation of Others in the Global Cultural Bazaar and the taming of absolute alterities is brilliantly encapsulated in Sara Ahmed’s term “stranger fetishism.” This specific way of relating to Others (or “aliens” in Ahmed’s language) allows the Human Self to perceive the threats posed by the presence of the “aliens,” yet it also enables him, from a comfortable position, to admire the “non-human” qualities that the Others possess. Not unlike a museum specimen, the Others in this context are to be seen, touched, discussed, and pondered, but never to be engaged with in human conversation.

When a Muslim Other does not live up to the benchmark of “likeability” (i.e., when a Muslim is a “bad Muslim” according to white, neo-liberal societies), her/his presence falls under the securitization narrative that identifies him/her as the embodiment of everything that is “beyond human.” This means that the very embodiment of “Muslimness” becomes the ultimate signifier of everything that is foreign and hostile. It is also important to note that the identification of a Muslim as the ultimate Other is gendered, both in its framings and in its implications.

The recent accusations of anti-Semitism against Muslim Congresswoman Ilhan Omar illustrate this final point well. Vanessa Taylor, in her analysis for The Intercept, argues that the singling out of Ilhan Omar in this case cannot be separated from the reproduction of imageries of the “angry black woman,” combined with an anti-Black Islamophobia that associates Black Muslims with anti-Semitic tendencies. Furthermore, as a hijabi woman of color who was also a refugee, Rep. Omar can only be one of two things through an Islamophobic lens. She is either an oppressed Muslim woman of color whose “freedom” is curtailed by her innately patriarchal faith, or she is a “double agent” whose hijab is the very reflection of her disloyalty to the cause of the modern nation-state.

Securitization of “Muslimness” (understood broadly here as any sociopolitical, religious, or racial signifier that makes a person “Muslim,” regardless of her/his actual religious belonging or lack thereof) is a mechanism that nation-states employ in order to keep these ultimate Others in check, or worse, to eliminate them completely from within their borders. From body surveillance to ethnocide and genocide (think of the Uighur Muslims in China and the Rohingya Muslims in Myanmar), many nation-states are committed to constructing “Muslimness” as one of the biggest “threats” to their existence, in spite of the many Muslims doing their best to showcase their national loyalties. From this perspective, the point is not whether or not Muslims are patriotic, but rather that their “Muslimness” renders them repositories of all that is alien. The Muslim Others will never be “Us.”

Lailatul Fitriyah
Lailatul Fitriyah is a Ph.D. candidate and Presidential Fellow in the World Religions and World Church Program, Department of Theology, University of Notre Dame, Indiana. She holds an MA in International Peace Studies from the Kroc Institute for International Peace Studies at the University of Notre Dame. Her current research focuses on the construction of feminist theologies of resistance in post-colonial Southeast Asia, feminist theologies of migration, and feminist interreligious dialogue.