GC: n
CT: Artificial brains are technologies that aim to produce computational models of the brain and its activity. These models take many forms, from straightforward simulations of particular brain regions to intricate neural networks designed to reproduce the behavior of the entire brain. Artificial brains have a wide range of possible uses, from deepening our understanding of the brain and its diseases to creating new kinds of robotics and artificial intelligence. The technology also carries significant ethical and cultural ramifications, including concerns about privacy, autonomy, and the possibility of abuse. This study investigates the current status of artificial brain technology, its possible advantages and disadvantages, and the ethical issues that must be considered when developing and deploying it.
S: SSRN – https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4492387 (last access: 1 April 2025)
N: 1. artificial (adj): late 14c., in the phrase artificial day “part of the day from sunrise to sunset,” from Old French artificial, from Latin artificialis “of or belonging to art,” from artificium. Meaning “made by man” (opposite of natural) is from early 15c. Applied to things that are not natural, whether real (artificial light) or not (artificial flowers). Artificial insemination dates from 1897. Artificial intelligence “the science and engineering of making intelligent machines” was coined in 1956.
brain (n): Old English brægen “brain,” from Proto-Germanic *bragnam (cognates: Middle Low German bregen, Old Frisian and Dutch brein), from PIE root *mregh-m(n)o- “skull, brain” (cognates: Greek brekhmos “front part of the skull, top of the head”). But Liberman writes that brain “has no established cognates outside West Germanic …” and is not connected to the Greek word. More probably, he writes, its etymon is PIE *bhragno “something broken.”
2. Artificial brains are man-made machines that are just as intelligent, creative and self-aware as humans. No such machine has yet been built, but it is only a matter of time. Given current trends in neuroscience, computing and nanotechnology, we estimate that artificial general intelligence will emerge sometime in the 21st century, maybe even by the year 2050.
3. Cultural Interrelation:
- Fiction: Arnold Schwarzenegger’s character in the Terminator movies is equipped with a processor-based artificial brain that enables him to think, react and learn. Much as we see in Isaac Asimov’s famous yet fictitious positronic brain, Arnie is able to do what he does because there is a melding between hardware and software in his central processing core – a melding that depends on intricate complexity in three dimensions.
- Reality: SyNAPSE is a DARPA-funded program to develop neuromorphic microprocessor systems that match the intelligence, physical size, and low power consumption of animal brains. SpiNNaker is a massively parallel neuromorphic computing architecture designed to model large, biologically plausible, spiking neural networks. In the BrainScaleS project, neuromorphic hardware is based on wafer-scale analog VLSI; each wafer implements ~200,000 spiking neurons and 49 million synapses.
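The spiking neurons these platforms implement in hardware can be illustrated in software with a minimal leaky integrate-and-fire (LIF) sketch. This is purely illustrative: the function name and all parameter values below are arbitrary assumptions, not taken from SyNAPSE, SpiNNaker, or BrainScaleS.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, Euler integration.
# All parameters are illustrative assumptions, not values from any
# of the neuromorphic projects named above.

def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_reset=0.0, v_threshold=1.0):
    """Return the membrane-potential trace and spike steps for a
    sequence of input currents (one value per time step)."""
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Membrane leaks toward rest while being driven by the input.
        v += dt * ((v_rest - v) / tau + i_in)
        if v >= v_threshold:      # threshold crossing -> emit a spike
            spikes.append(step)
            v = v_reset           # reset the membrane after the spike
        trace.append(v)
    return trace, spikes

# Constant drive strong enough to make the neuron fire repeatedly.
trace, spikes = simulate_lif([0.1] * 100)
print(f"{len(spikes)} spikes at steps {spikes}")
```

Hardware such as BrainScaleS evaluates dynamics like these in parallel analog circuits rather than in a sequential loop, which is where the speed and power advantages over conventional processors come from.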
S: 1. Etymonline – http://www.etymonline.com/index.php?allowed_in_frame=0&search=artificial+brain&searchmode=none (last access: 4 November 2014). 2. ARTI – http://www.artificialbrains.com/ (last access: 4 November 2014). 3. Risksc – http://www.riskscience.umich.edu/could-we-one-day-3d-print-arnold-schwarzeneggers-brain/ (last access: 31 March 2015); Artbr – http://www.artificialbrains.com/ (last access: 31 March 2015).
SYN:
S:
CR: artificial intelligence, cerebrum, cognition, cognitive science.