
Brain-machine interfaces (BMIs), also referred to as brain-computer interfaces (BCIs), have rapidly been gaining popularity as a topic of conversation for bioengineers and ethicists as well as for the public.[1] It is no surprise that a technology that promises to restore function to those who have lost it or who never had it in the first place would be so appealing. Headlines such as “Brain Implants Give People Back What They Lost” communicate that BMIs are a solution to neurological ailments such as paraplegia, which rob people of the ability to engage with the world as those with fully abled bodies can.[2] BMIs appear to be a hopeful technology, but we must not be too quick to adopt such radical solutions to our ailments without proper consideration of the limitations and ethical concerns that may come with them. Rapidly evolving technology requires an enduring, biblically grounded ethical framework that can guide us in ascertaining how the technologies in question may impede or promote virtue, and consequently, human flourishing. As such, BMIs must be held up to a framework of virtue ethics as we attempt to establish which aspects of them are ethically permissible, if any. Doing so will inform further evaluation of how BMIs may be appropriately utilized in accordance with the virtues they may foster that embrace God’s design for humanity or vices they may incite that guide us away from his design and dishonor God himself.
BMIs were first proposed in 1973 and quickly became a controversial area of research due to their alluring but intimidating potential.[3] With groundbreaking studies revealing the intricacies of neurons, researchers grew more interested in exploring brain structures and activity, particularly through the use of bundles of microelectrodes implanted in various locations in the brain.[4]
It was only a short time—less than ten years—before the capacity for recording neurons jumped from a few neurons at a time to one hundred neurons at a time.[5] This opened up a world of possibilities for neuroscience research. Implantable BMIs were developed during this time as well, with the original intent of analyzing the physiological structures of the brain. It quickly became evident that BMIs could be utilized not only for research purposes but also for practical applications such as neuroprostheses, assisting in the restoration of function for those who have impaired mobility—specifically, people with missing or damaged body parts or severe paralysis due to “trauma to the nervous systems, notabl[e] spinal cord injuries or neurodegenerative diseases.”[6]
At the same time, many people regard BMIs such as brain implants with trepidation, uneasy with the idea of a machine embedded into the human brain. Though this wariness about new technology is not unwarranted, many technologies we view as commonplace today were once considered just as unnerving. For example, the cochlear implant, first successfully implanted in 1961 by Dr. William House and approved by the FDA in 1984, was not initially well-received.[7] As it became further developed and increasingly used, many continued to meet cochlear implantation with skepticism and opposed its use despite its success.[8] Keeping this in mind, we can approach the issue of brain-machine interfaces more openly.
BCIs or BMIs operate by processing the user’s real-time brain activity to manipulate external devices.[9] Through mechanisms including a sensor, decoder, and translator, brain signals can be processed and applied to a device—for example, a computer, robot arm, or drone—in order to enable unconventional forms of communication that do not rely on ordinary physiological functioning.[10] Most BMIs decode the subject’s electrophysiological signals to infer his or her physiological intent.[11]
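The sensor-decoder-translator flow described above can be illustrated with a deliberately simplified sketch. The following Python fragment is purely conceptual: the function names, the toy threshold decoder, and the cursor commands are hypothetical stand-ins, not the API of any real BMI system.

```python
# Conceptual sketch of the sensor -> decoder -> translator stages of a BMI.
# All names and the toy threshold decoder are hypothetical simplifications.

def sense(raw_samples):
    """Sensor stage: stand-in for extracting features from electrode signals."""
    return [abs(s) for s in raw_samples]

def decode(features, threshold=0.5):
    """Decoder stage: infer a coarse 'intent' from the features."""
    return "move" if sum(features) / len(features) > threshold else "rest"

def translate(intent):
    """Translator stage: map the decoded intent onto a device command."""
    commands = {"move": "CURSOR_RIGHT", "rest": "CURSOR_HOLD"}
    return commands[intent]

# One pass through the pipeline with made-up samples.
raw = [0.9, -0.8, 0.7, -0.6]
command = translate(decode(sense(raw)))
print(command)  # CURSOR_RIGHT
```

Real systems replace each stage with sophisticated signal processing and machine learning, but the division of labor—sensing, decoding intent, and translating it into device commands—is the same one the literature describes.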
There exist invasive and non-invasive BMIs, both of which carry a set of immediate and long-term risks for the user.[12] Brain implants that are surgically interwoven into brain matter, such as those created by Elon Musk’s company Neuralink, would constitute invasive BMIs. These are more accurate than non-invasive BMIs but hold more risk due to factors such as surgical complications.[13]
The implications of being able to train human brains to interact physically with the world by merely thinking about an action are astonishing. Noland Arbaugh, the first person to have been successfully implanted with a Neuralink chip, has experienced great joy in his newfound ability to control a computer cursor through electrophysiological signals despite his quadriplegia.[14]
The effectiveness of BMIs can be attributed to the incredible neuroplasticity of the brain.[15] The brain’s malleability throughout the lifetime of a person makes it possible to build new neural networks continuously and allows for the brain to change in response to both environmental stimuli and activity within the brain itself.[16] For those recovering from a stroke, for instance, neuroplasticity plays a salient role in regaining motor function.[17] The fluidity of these rapidly evolving networks also means that synaptic connections between neurons can be altered and even eliminated should they go underused, establishing a “use-it-or-lose-it” phenomenon.[18]
Neuroplasticity uniquely equips the brain to perform motor tasks through a BMI much as it would when learning any traditional motor activity.[19] BMIs engage areas of the brain involved in natural sensorimotor training; in place of traditional sensory feedback, however, the brain receives continuous information through sophisticated technologies: tactile and proprioceptive cues that simulate touch or physical sensation, intracortical microstimulation that triggers the somatosensory cortex, and, most commonly, visual feedback. This stream of information equips the brain to adapt to the interface.[20] BMIs such as those that enable users to control computer cursors most often utilize outgoing efferent signals to coordinate movements; they remain effective despite the lack of incoming afferent signals supplying sensory feedback because the brain adapts so readily to the interface.[21] Through continuous practice, users of BMIs can improve their execution of motor skills even after initial difficulties operating the interface.[22] This practice is what makes it possible for humans to experience brain remapping and thereby regard actuators as extensions of the body rather than foreign objects.[23]
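The practice-driven adaptation described above can be caricatured numerically. In the toy sketch below, a single decoder gain is nudged trial by trial toward the value that maps neural activity onto the movement the user intends, with the error serving as the “visual feedback.” Every number and name here is hypothetical; the sketch only illustrates how repeated feedback can drive convergence, not how any actual BMI learns.

```python
# Toy closed-loop adaptation: across repeated trials, a single decoder gain
# is nudged toward the value that maps neural activity onto the intended
# cursor displacement. All values are hypothetical illustrations of the
# practice-driven improvement described in the text.

def run_trials(true_gain=2.0, learned_gain=0.5, lr=0.3, trials=40):
    errors = []
    for _ in range(trials):
        activity = 1.0                         # stand-in for a neural feature
        intended = true_gain * activity        # where the user wants the cursor
        produced = learned_gain * activity     # where the decoder puts it
        error = intended - produced            # "visual feedback" signal
        learned_gain += lr * error * activity  # adapt using the feedback
        errors.append(abs(error))
    return errors

errors = run_trials()
print(round(errors[0], 3), round(errors[-1], 6))  # error shrinks with practice
```

The steadily shrinking error is the point: with continuous feedback and practice, performance improves even when the decoder starts out badly mismatched, loosely mirroring how users improve at operating an interface over time.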
Famed entrepreneur and tech enthusiast Elon Musk has made significant headway in this field through his company Neuralink.[24] Unlike other brain implants, those created by Musk’s company consist of thousands of electrodes attached to threads that are precisely inserted into the brain and, remarkably, are wireless.[25] This surpasses other implants, many of which cannot detect such specific neuron groupings and require external wires to function. It is no wonder, then, that Neuralink has seemed to revolutionize the world of brain-machine interfaces, even sparking competition that has led Chinese scientists to develop a similar, smaller wireless implant that may well prove superior once it moves beyond its early stages.[26] For those who experience physical disabilities that inhibit motor functioning, BMIs such as Neuralink have promised hope in being able to re-engage with society through digital communication and to participate in pleasurable activities such as chess or video games.[27]
At the same time, Musk proves overly concerned with technological progress for the sake of progress itself. While BMIs for those in whom disability impairs physical functioning seem considerably altruistic, Musk has also endeavored to engineer devices that push the limits of human-machine relations.[28] He has championed technologically progressive aspirations such as enhanced communication efficiency between people with the goal of creating a “symbiosis with artificial intelligence.”[29] Even more, he has insinuated that humans must merge with machines if we do not wish to succumb to the inevitable threat of domination that AI poses.[30] It is in this sense that Musk could be considered a technological messianist, or one for whom technology appears salvific.[31] To a person of that mindset, technology is not simply a tool to be used at the discretion of the person but a foreseeable extension of the self that must be embraced if we are to avoid the collapse of humanity. Without engaging with technology such as BMIs critically, however, one is at risk of overlooking severe ethical problems. More troubling still is how commonly the negative aspects of BMIs are brushed aside and minimized.[32]
Only recently having been implanted into humans, invasive brain-machine interfaces such as those created by Neuralink pose an abundance of ethical conundrums.[33] As previously noted, the very act of placing an invasive BMI into the human brain puts people at risk, potentially leading to severe injuries.[34] This means that the benefits of the technology must be carefully weighed against possible physical risks. It also means that there must be consensus about what the technology is intended for exactly, whether it is being used for restoration or for enhancement.
Beyond the initial insertion process, the interconnection between the brain and a BMI calls into question the autonomy of the individual housing the implant due to the powerful cognitive shaping it stimulates.[35] If a human’s neural pathways are being continuously shaped by the presence of the device, how can others be sure that the thoughts, feelings, and behaviors of said person are truly his or her own? While respect for autonomy is often a major consideration in the formulation of neurotechnologies like BMIs, there is simultaneously a weighty threat to a person’s autonomy as the device alters physiological mechanisms within the brain, rendering autonomy uncertain.[36] Not only are human social/relational engagements potentially compromised through these physical alterations, but informed consent is as well.[37]
The gravity of these risks demands serious grappling when considering the use of BMIs as a restorative tool. Given the promising possibilities of BMI usage for individuals who lack the ability to communicate verbally or physically, there is a case to be made for permitting certain risks associated with the device. Subjecting individuals to these risks for the sake of enhancement, however, warrants censure. Matters such as brain-to-brain interactions and cognitive enhancement are ethically problematic to begin with, even before factoring in these physical risks.
The same technology that would allow for brain-to-brain interactions—what Musk so desperately yearns for—would be deleterious to individuality and authenticity.[38] The merging of human thought between individuals would threaten to eliminate dissent and uniqueness, both of which are fundamental to a progressing society. Having access to another person’s innermost thoughts erodes the fundamental privacy humans have possessed since creation.[39] Neil Messer touches on this concern in his evaluation of brain reading, or the technique of using “electro-encephalography (EEG) or functional magnetic resonance imaging (fMRI) to gain knowledge of subjects’ mental states or thoughts.”[40] Not only does this technique reduce the human mind to mere biological mechanisms, but it is prone to errors and does not account for differences between and within individuals’ brain activity.[41] Applications of brain reading are remarkable: It can be used in conjunction with brain-computer interfaces and to detect consciousness in individuals diagnosed with Unresponsive Wakefulness Syndrome (UWS), in addition to neuromarketing strategies and lie detection.[42] Though Messer addresses brain reading primarily in the context of the latter two situations, his argument regarding privacy holds true in the context of brain-to-brain interactions, such as those envisioned by Musk. Under the assumption that we all have a moral right to privacy, Messer, quoting Mark Tunick, concludes that losing our innermost form of privacy opens the door for manipulation and indignity as a result of “‘being exposed or accessed by others without one’s consent.’”[43] Moreover, the popular use of BMIs capable of brain-to-brain interaction would place humans on a trajectory toward a mental monoculture devoid of cognitive diversity and filled with fearful vulnerability.[44] Scientific progress would come at the cost of privacy and individuality, essential aspects of our humanity. 
In this way, the application of BMIs for brain-to-brain interactions suppresses human flourishing not only by deteriorating virtue but by cultivating vice.
Another contentious aspect of BMIs is the prospect of cognitive enhancement.[45] The improvement of memory, attention, and mood beyond the realm of “normal” functioning is a thrilling prospect to those who see the physical human body as an obstacle to be conquered rather than a form of embodiment for the intrinsically valuable human being.[46] The quest for excellence can be seen as characteristic of our human nature.[47] Nevertheless, the definition of excellence itself and the means of reaching it are subject to questioning. How we determine excellence is contextually dependent; what could be considered excellent in one situation may not be in another. When these ever-changing criteria become the standard of living, those who fall outside the standard may face ridicule, as evidenced in discriminatory eugenic practices spanning over a century.[48] Furthermore, if the means to achieve this precarious definition of excellence require the use of bioenhancement, we have erroneously outsourced the matter to the realm of biotechnology.[49] Technologically altering human nature is not the solution to defining or cultivating a cognitively and morally “excellent” society. Rather, it disallows the conditions necessary to develop the very virtues that allow us to thrive.
Here, we can turn to a Christian ethic to help navigate these troubling quandaries. Keeping in mind the risks brain-machine interfaces pose, there arises immediately the matter of disregarding the welfare of one’s neighbor in favor of technological innovation (Mark 12:31). Allowing the lure of progress to obfuscate or supersede the good of the person, even those whom the technology is being created to help, inevitably leads to the abandonment of ethical standards that are integral to the cultivation of human flourishing in a biotech world. It is true that technologies developed with the goal of neurorehabilitation have proven to be more effective than standard care in terms of patient outcomes; however, the eagerness of developers to disperse these technologies appears to be outpacing the ethical and legal valuations necessary to ensure they are not at risk of compromising patients’ rights.[50] Particularly in cases of invasive BMIs, which require an elaborate interconnection between a computer and a human brain, there must be more scrupulous ethical consideration, not less.
Being called to love one’s neighbor as oneself compels us to consider the benefit and detriment posed by advanced technologies such as BMIs. Letting excessive technological wariness close off a potential route to improved flourishing could be said to be just as mistaken as abandoning ethical evaluations of technologies that carry such a great risk of harm physically, cognitively, and even existentially. In cases of severe disability, such as locked-in syndrome, the risk of surgical implantation may indeed be acceptable should it provide the possibility of increased engagement.[51] The improved communication achieved through this means aligns with our God-given design to be in community with one another, consequently contributing to greater flourishing (Gen 2:18).[52] At the same time, its application may prove unacceptable if shown to perpetuate societal inequities significantly by limiting access, or to threaten the privacy of the individual reliant on the device, particularly in cases where ethical assessment prior to and in conjunction with implementation has proven inadequate. Christians should consequently work to understand comprehensively the nature of emerging technologies such as BMIs so as to recognize both their potential role in serving the needs of neighbors as well as their unnecessary risk, altogether avoiding advocating for technologies that disproportionately threaten an individual’s wellbeing or the wellbeing of society at large. It is then crucial for Christians to be prepared to help bear the suffering of individuals who may feel abandoned when promising technology proves ineffective, undesirable, or dangerous and come alongside them in Christ-like community regardless of their physical, mental, or spiritual state (1 Thes 5:14; Gal 6:2).
Infatuation with technological advancement at the expense of sound ethics not only results in the neglect of our neighbors’ wellbeing but of our role as stewards of that which God has given us (Gen 2:15; Deut 8:17–18; John 3:27).[53] The choices we make regarding how to create and implement technologies must be “informed by our values,” not the other way around.[54] Ienca et al. suggest that “ethical values should be proactively incorporated at the level of design” rather than being disregarded until the technology is nearly complete, then used in ongoing evaluation while the technology is in use.[55] Without a Christian moral framework to guide us, we will inevitably become absorbed in earthly endeavors that fail to glorify God. Consequently, instead of stewarding what we have created, we risk wrongfully worshiping it and abusing it, a scenario seen clearly in Genesis 11 through the construction of the Tower of Babel.[56] Human creativity is not in itself wicked; in fact, our creativity resembles the creative nature of God himself (Gen 1:27).[57] This passage in Genesis warns of the hubris of humans who worshiped their own creation instead of their Creator, ultimately leading to their downfall. Humans have natural limitations that prevent us from ever reaching a state of equality with God.[58] Our sinfulness ensures that attempting to remove power over nature from God’s hands and ascribe it to humans inevitably results in disaster.[59] With this in mind, we must not let the thrill of human capabilities diminish our recognition of God’s sovereignty or make us lose sight of our obligation to honor him through the creativity he has gifted us with (Prov 3:9).
Regarding the prospect of brain-to-brain interactions embraced by Elon Musk, it is crucial to remember that only God can fully know our minds (Ps 139:23–24; 1 Cor 13:12). Indeed, this is partly what makes the gift of salvation so miraculous (Rom 5:8). To regard the knowledge we could have of other persons through brain-to-brain interactions facilitated by BMIs as “God-like” or on par with the way in which God knows us is unarguably mistaken.[60] Endeavoring to invade the privacy of and alter the human mind through means of BMIs not only reduces humans to our biological mechanisms but attempts out of hubris to elevate human capacities. In fact, any knowledge we might obtain about somebody through this technology would be merely a distortion of how we ought to know and be known.[61] In our fallen world, we may very well be persuaded to inflate, diminish, or conceal aspects of ourselves if our mental privacy were stripped; similarly, we might be tempted to do the same to others depending on how it best served the matter at hand, altering the moral foundation of society.[62] We would appear to be uncomfortably close to a posture of divine judgement, albeit without the incomprehensible mercy we all desperately need (Rom 6:23; 1 Cor 4:5).[63] God knows us fully; he knows us outside of the limits of time and in the context of his salvation and justification, enabling him to judge us righteously (Gal 3:24).[64] Even the most sophisticated BMI lacks the capacity to rival the all-good omnipotence and omniscience of God. The reality of human depravity means that “biomedical enhancements, being part of culture, work to transmit the destructive effects of sin and have the potential to magnify human depravity.”[65] As such, striving for a technology that seeks to imitate God in this way remains morally problematic; and, in the case that this kind of technology is normalized, holding to these truths of Scripture will be all the more important.
Furthermore, Scripture makes abundantly clear that God created mankind in his image (Gen 1:26, 9:6; Col 3:10). The cognitive enhancement of human beings directly flies in the face of a sound theological anthropology. Champions of enhancement are eager to transcend the supposedly inconvenient limits of the human body, but our bodies have been declared good by God and are therefore not meant to be viewed as obstacles to a better life but as living sacrifices through which we serve God and others as exemplified in the life, death, and resurrection of Jesus incarnate (Gen 1:31; Rom 12:1; 1 Tim 3:16). Intervening in already healthy persons in order to cognitively enhance the mind denies the goodness of God’s creation while simultaneously disregarding the inherent value of human beings regardless of ability (Matt 10:29–31).[66] Cognitive enhancement through BMIs promotes a transhumanist agenda that challenges the fundamentally sacred design of human beings and opts instead for an artificially manufactured society deprived of authenticity and diversity. This puts those who fall outside of the margins of the new “normal” in peril of being seen as lesser humans, or at worst, not human at all. Enhancements of the human mind are not a pathway to better humans but one leading to dehumanization.[67]
In summary, brain-machine interfaces are still an emerging technology that is subject to great praise and in need of greater scrutiny. While there seem to be genuine benefits to those with disabilities, there remain significant ethical predicaments that need to be grappled with seriously and intentionally: physical risk at the point of installation, uncertainty regarding privacy and autonomy, the dangers of cognitive enhancement, and struggles to carry out the Christian calling to love one’s neighbor, to steward well the creativity we have, to avoid unrighteous judgement and distrust, and to preserve the dignity of all humans created in God’s image. Is the answer, then, to abandon the technology of BMIs altogether? Fabrice Jotterand and James Giordano would say “no”; rather, more research must be conducted to assess appropriately the ethical applications of the technology.[68] There may indeed be a place for BMIs to assist in flourishing as restorative interventions if proper ethical boundaries are established. To do so, it is imperative a biblical ethical framework be consulted to determine how God intends humans to flourish. BMIs are just one of many technologies that, if developed and applied ethically, offer tremendous hope to persons facing suffering in our fallen world. Until BMIs are better developed and regulated, however, they will continue to be a morally precarious technology laden with unavoidable harms that Christians in particular should remain wary of, making certain to maintain a proper understanding of God’s design for humanity so as not to lose sight of our calling to love the Lord with all of our heart, soul, and mind, and to love our neighbor as ourself, even in a biotech world (Matt 22:37–39).
[1] Adrien B. Rapeaux and Timothy G. Constandinou, “Implantable Brain Machine Interfaces: First-in-Human Studies, Technology Challenges and Trends,” Current Opinion in Biotechnology 72 (2021): 102, https://doi.org/10.1016/j.copbio.2021.10.001.
[2] Neil Savage, “Brain Implants Give People Back What They Lost,” Communications of the ACM 68, no. 2 (2025): 17–19, https://doi.org/10.1145/3701222.
[3] J. J. Vidal, “Toward Direct Brain-Computer Communication,” Annual Review of Biophysics and Bioengineering 2 (1973): 157–80, https://doi.org/10.1146/annurev.bb.02.060173.001105.
[4] Mikhail A. Lebedev and Miguel A. L. Nicolelis, “Brain-Machine Interfaces: From Basic Science to Neuroprostheses and Neurorehabilitation,” Physiological Reviews 97, no. 2 (2017): 769, https://doi.org/10.1152/physrev.00027.2016.
[5] Lebedev and Nicolelis, “Brain-Machine Interfaces,” 769.
[6] Ankur Gupta, Nikolaos Vardalakis, and Fabien B. Wagner, “Neuroprosthetics: From Sensorimotor to Cognitive Disorders,” Communications Biology 6, no. 1 (2023): 14, https://doi.org/10.1038/s42003-022-04390-w; Baraka Maiseli, Abdi T. Abdalla, Libe V. Massawe, Mercy Mbise, Khadija Mkocha, Nassor Ally Nassor et al., “Brain-Computer Interface: Trend, Challenges, and Threats,” Brain Informatics 10, no. 1 (2023): 2, https://doi.org/10.1186/s40708-023-00199-3.
[7] Robert Yawn, Jacob B. Hunter, Alex D. Sweeney, and Marc L. Bennett, “Cochlear Implantation: A Biomechanical Prosthesis for Hearing Loss,” F1000Prime Reports 7 (2015): 1–2, https://doi.org/10.12703/P7-45.
[8] Yawn et al., “Cochlear Implantation,” 2.
[9] Erika J. Davidoff, “Agency and Accountability: Ethical Considerations for Brain-Computer Interfaces,” The Rutgers Journal of Bioethics 11 (2020): 9–20.
[10] Davidoff, “Agency and Accountability,” 1; Maiseli et al., “Brain-Computer Interface,” 1; Lebedev and Nicolelis, “Brain-Machine Interfaces,” 770.
[11] Davidoff, “Agency and Accountability,” 1; Lebedev and Nicolelis, “Brain-Machine Interfaces,” 770.
[12] Efstratios Livanis, Polychronis Voultsos, Konstantinos Vadikolias, Panagiotis Pantazakos, and Alexandra Tsaroucha, “Understanding the Ethical Issues of Brain-Computer Interfaces (BCIs): A Blessing or the Beginning of a Dystopian Future?” Cureus 16, no. 4 (2024): 3, https://doi.org/10.7759/cureus.58243; Maiseli et al., “Brain-Computer Interface,” 2.
[13] Livanis et al., “Understanding the Ethical Issues,” 3; Elon Musk, “An Integrated Brain-Machine Interface Platform with Thousands of Channels,” Journal of Medical Internet Research 21, no. 10 (2019): e16194, https://doi.org/10.2196/16194; “Neuralink—Pioneering Brain Computer Interfaces,” Neuralink, accessed February 13, 2025, https://neuralink.com/.
[14] Emily Mullin, “Elon Musk’s Neuralink Had a Brain Implant Setback. It May Come Down to Design,” Wired, May 9, 2024, https://www.wired.com/story/neuralinks-brain-implant-issues/.
[15] Lebedev and Nicolelis, “Brain-Machine Interfaces,” 801.
[16] Suparna Choudhury and Kelly A. McKinney, “Digital Media, the Developing Brain and the Interpretive Plasticity of Neuroplasticity,” Transcultural Psychiatry 50, no. 2 (2013): 196, https://doi.org/10.1177/1363461512474623.
[17] Danylo F. Cabral, Peter Fried, Sebastian Koch, Jordyn Rice, Tatjana Rundek, Alvaro Pascual-Leone et al., “Efficacy of Mechanisms of Neuroplasticity After a Stroke,” Restorative Neurology and Neuroscience 40, no. 2 (2022): 73–84, https://doi.org/10.3233/RNN-211227.
[18] Choudhury and McKinney, “Digital Media,” 193.
[19] Martin Korte, “The Impact of the Digital Revolution on Human Brain and Behavior: Where Do We Stand?,” Dialogues in Clinical Neuroscience 22, no. 2 (2020): 107, https://doi.org/10.31887/DCNS.2020.22.2/mkorte; Jeremiah D. Wander, Timothy Blakely, Kai J. Miller, Kurt E. Weaver, Lise A. Johnson, Jared D. Olson et al., “Distributed Cortical Adaptation During Learning of a Brain–Computer Interface Task,” Proceedings of the National Academy of Sciences 110, no. 26 (2013): 10821, https://doi.org/10.1073/pnas.1221127110.
[20] Lebedev and Nicolelis, “Brain-Machine Interfaces,” 771.
[21] Wander et al., “Distributed Cortical Adaptation,” 10821.
[22] Lebedev and Nicolelis, “Brain-Machine Interfaces,” 801.
[23] Lebedev and Nicolelis, “Brain-Machine Interfaces,” 801.
[24] Neuralink, “Pioneering Brain Computer Interfaces.”
[25] Musk, “An Integrated Brain-Machine Interface,” 1.
[26] “Elon Musk’s Neuralink Implants Brain Chip in First Human,” Reuters, January 30, 2024, https://www.reuters.com/technology/neuralink-implants-brain-chip-first-human-musk-says-2024-01-29/; “Chinese Brain Implant Reaches Landmark Clinical Trial with Operation on Amputee,” South China Morning Post, June 14, 2025, https://www.scmp.com/news/china/science/article/3314461/chinese-brain-implant-reaches-landmark-clinical-trial-operation-amputee; Bojan Stojkovski, “China Tests Neural Implant That Lets Amputee to Move Cursor with Mind,” Interesting Engineering, June 15, 2025, https://interestingengineering.com/science/china-tests-neural-implant-amputee-move-cursor.
[27] Mullin, “Elon Musk’s Neuralink.”
[28] Mike Allen and Jim VandeHei, “Elon Musk: Humans Must Merge with Machines,” Axios, November 26, 2018, https://www.axios.com/2018/11/26/elon-musk-artificial-intelligence-neuralink.
[29] Allen and VandeHei, “Elon Musk.”
[30] “Elon Musk: If Humans Are to Survive, We Must Merge with Machines,” Futurism, November 16, 2017, https://futurism.com/elon-musks-and-the-need-for-symbiosis-with-machines.
[31] William P. Cheshire, Jr., “Thinking About Neuralinking: Christian Considerations Before Getting a Brain Chip” (presented at Formed 2024: Technology and the Christian, Chattanooga, TN, 2024).
[32] Livanis et al., “Understanding the Ethical Issues,” 5.
[33] Livanis et al., “Understanding the Ethical Issues,” 5.
[34] Livanis et al., “Understanding the Ethical Issues,” 3.
[35] Livanis et al., “Understanding the Ethical Issues,” 4.
[36] Marcello Ienca, Reto W. Kressig, Fabrice Jotterand, and Bernice Elger, “Proactive Ethical Design for Neuroengineering, Assistive and Rehabilitation Technologies: The Cybathlon Lesson,” Journal of NeuroEngineering and Rehabilitation 14, no. 1 (2017): 2, https://doi.org/10.1186/s12984-017-0325-z.
[37] Livanis et al., “Understanding the Ethical Issues,” 4.
[38] Emma C. Gordon and Anil K. Seth, “Ethical Considerations for the Use of Brain-Computer Interfaces for Cognitive Enhancement,” PLoS Biology 22, no. 10 (2024): 5, 9, https://doi.org/10.1371/journal.pbio.3002899; Allen and VandeHei, “Elon Musk.”
[39] Gordon and Seth, “Ethical Considerations,” 7.
[40] Neil Messer, “Judging the Secret Thoughts of All: Functional Neuroimaging, ‘Brain Reading’, and the Theological Ethics of Privacy,” Studies in Christian Ethics 34, no. 1 (2021): 18, https://doi.org/10.1177/0953946820910328.
[41] Messer, “Judging the Secret,” 19.
[42] Messer, “Judging the Secret,” 20.
[43] Messer, “Judging the Secret,” 23.
[44] Gordon and Seth, “Ethical Considerations,” 8.
[45] Gordon and Seth, “Ethical Considerations,” 5; Livanis et al., “Understanding the Ethical Issues,” 5.
[46] Livanis et al., “Understanding the Ethical Issues,” 5.
[47] Beyond Therapy: Biotechnology and the Pursuit of Happiness (The President’s Council on Bioethics, 2003), “Superior Performance,” https://bioethicsarchive.georgetown.edu/pcbe/reports/beyondtherapy/chapter3.html.
[48] Mannie Liscum and Michael L. Garcia, “You Can’t Keep a Bad Idea Down: Dark History, Death, and Potential Rebirth of Eugenics,” The Anatomical Record 305, no. 4 (2022): 902–37, https://doi.org/10.1002/ar.24849.
[49] C. Ben Mitchell, Edmund D. Pellegrino, Jean Bethke Elshtain, John F. Kilner, and Scott Rae, Biotechnology and the Human Good (Georgetown University Press, 2007), 129.
[50] Ienca et al., “Proactive Ethical Design,” 2.
[51] William P. Cheshire, Jr., “Machine Intelligence as Interpreter: Ethical Implications of Neural Speech Decoding,” Ethics & Medicine 35, no. 2 (2019): 71–78.
[52] Mitchell et al., Biotechnology and the Human Good, 108.
[53] Mitchell et al., Biotechnology and the Human Good, 23.
[54] Mitchell et al., Biotechnology and the Human Good, 24.
[55] Marcello Ienca, Tenzin Wangmo, Fabrice Jotterand, Reto W. Kressig, and Bernice Elger, “Ethical Design of Intelligent Assistive Technologies for Dementia: A Descriptive Review,” Science and Engineering Ethics 24, no. 4 (2018): 1039, https://doi.org/10.1007/s11948-017-9976-1.
[56] Mitchell et al., Biotechnology and the Human Good, 25.
[57] Mitchell et al., Biotechnology and the Human Good, 28.
[58] Mitchell et al., Biotechnology and the Human Good, 89.
[59] Mitchell et al., Biotechnology and the Human Good, 98.
[60] Messer, “Judging the Secret,” 29.
[61] Messer, “Judging the Secret,” 29.
[62] Messer, “Judging the Secret,” 34.
[63] Messer, “Judging the Secret,” 24.
[64] Messer, “Judging the Secret,” 28.
[65] William P. Cheshire, Jr., “Beyond Humanity: Theological and Biotechnological Perspectives on Enhancement in Dialogue,” Ethics & Medicine 31, no. 2 (2015): 77.
[66] Beyond Therapy, “Happy Souls,” https://bioethicsarchive.georgetown.edu/pcbe/reports/beyondtherapy/chapter5.html.
[67] Mitchell et al., Biotechnology and the Human Good, 127.
[68] Fabrice Jotterand and James Giordano, “Transcranial Magnetic Stimulation, Deep Brain Stimulation and Personal Identity: Ethical Questions, and Neuroethical Approaches for Medical Practice,” International Review of Psychiatry 23, no. 5 (2011): 481, https://doi.org/10.3109/09540261.2011.616189.
Laura A. Cheshire, "Neuroengineering Hope and Harm: Ethical Dilemmas of Brain-Machine Interfaces," Dignitas 32, no. 3–4 (2025): 3–7, www.cbhd.org/dignitas-articles/neuroengineering-hope-and-harm-ethical-dilemmas-of-brain-machine-interfaces.