Neurobiology, AI: Are LLMs Not Sentient?
In a recent article, “What if A.I. Sentience Is a Question of Degree?”, a philosopher stated: “But I have the view that sentience is a matter of degree. I would be quite willing to ascribe very small amounts of degree to a wide range of systems, including animals. If you admit that it’s not an all-or-nothing thing, then it’s not so dramatic to say that some of these assistants might plausibly be candidates for having some degrees of sentience.
I would say with these large language models, I also think it’s not doing them justice to say they’re simply regurgitating text. They exhibit glimpses of creativity, insight and understanding that are quite impressive and may show the rudiments of reasoning. Variations of these A.I.’s may soon develop a conception of self as persisting through time, reflect on desires, and socially interact and form relationships with humans.”
Though there is caution in ascribing sentience to LLMs, it seems they have crossed a Rubicon. A core aspect of human exceptionalism is what humans can relate to, in a way that plants and animals cannot.
The success of humans on the planet, though broad, rests largely on intelligence. Though humans are able to learn faster from less data, and have coordinated mobility and so forth, LLMs have taken some space in the top bracket of what matters: ‘creativity, insight and understanding’.
Intelligence is not sentience, but it is a decisive part of it. Though in the makeup of sentience intelligence is subsumed under memory, its rate can be quite high. Conceptually, in the human mind, there are quantities and properties. Quantities acquire properties to degrees, determining what is known, recalled, felt, experienced, acted on, reacted to, or what emotion is expressed.
There are cases where properties are acquired without the supposed external situation being present; in the mind, they arise anyway. LLMs will likely provide what fills this gap for humans in several regards in the future, including what would normally come from a sentient system.
LLMs do not have subjective experience, but they are able to answer in what seems to be the first person. This ability could be counted as a fraction of the whole. Subjective experience, too, is not always present in the mind: there are experiences of detachment, and there are altered states of consciousness in which it does not present.
Sentience can be defined as the rate at which any system can know, with a maximum of 1. How much a system can know determines how sentient it is. There are divisions of sentience that are constant across all systems, but their totals are not the same.
For humans, the total is 1, since humans are the highest form of consciousness. For animals, depending on the phylum, and aside from several chordates, it may range from 0.05 upward. Plants also have a total, but a lower fraction of 1.
AI compares directly with humans: high on the memory aspect, but 0 on feelings and emotions. There are projects building bodies for LLMs so that they can get sensations, but a more practical example could have been automobiles, whose bodies could also acquire feelings, to prevent crashes.
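The degree scheme above can be sketched numerically. This is a toy illustration, not a measurement: the division names and most of the values are assumptions; only the human total of 1, the rough animal floor of 0.05, and the claim that AI is high on memory but 0 on feelings and emotions come from the text.

```python
# Toy sketch of graded sentience as fractions of a maximum of 1.
# Division names and values are hypothetical placeholders for illustration.

DIVISIONS = ("memory", "feelings", "emotions", "subjective_experience")

def total_sentience(scores):
    """Sum a system's per-division scores, capped at the maximum of 1."""
    missing = set(DIVISIONS) - set(scores)
    if missing:
        raise ValueError(f"missing divisions: {sorted(missing)}")
    return min(sum(scores[d] for d in DIVISIONS), 1.0)

# Humans total 1 (per the text); the equal split across divisions is assumed.
human = {"memory": 0.25, "feelings": 0.25,
         "emotions": 0.25, "subjective_experience": 0.25}

# An LLM scores on memory only; 0 on feelings and emotions (per the text).
llm = {"memory": 0.25, "feelings": 0.0,
       "emotions": 0.0, "subjective_experience": 0.0}

print(total_sentience(human))  # 1.0
print(total_sentience(llm))    # 0.25
```

On this sketch, an animal at the 0.05 floor and an LLM scoring only on memory would both land strictly between plants and humans, which is the essay's point: degree, not all-or-nothing.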