Tag Archives: cognition

A dive into the black waters under the surface of persuasive design

A Guardian article last October brings the darker aspects of the attention economy, particularly the techniques and tools of neural hijacking, into sharp focus. The piece summarizes some interaction design principles and trends that signal a fundamental shift in the means, deployment, and startling effectiveness of mass persuasion. The mechanisms reliably and efficiently leverage neural reward (dopamine) circuits to seize, hold, and direct attention toward whatever ends designers and content providers choose.

The organizer of a $1,700-per-person event, convened to show marketers and technicians “how to manipulate people into habitual use of their products,” put it baldly:

subtle psychological tricks … can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation”

Particularly telling of the growing ethical worry are the defections from social media among Silicon Valley insiders.

Pearlman, then a product manager at Facebook and on the team that created the Facebook “like”,  … confirmed via email that she, too, has grown disaffected with Facebook “likes” and other addictive feedback loops. She has installed a web browser plug-in to eradicate her Facebook news feed, and hired a social media manager to monitor her Facebook page so that she doesn’t have to.
It is revealing that many of these younger technologists are weaning themselves off their own products, sending their children to elite Silicon Valley schools where iPhones, iPads and even laptops are banned. They appear to be abiding by a Biggie Smalls lyric from their own youth about the perils of dealing crack cocaine: never get high on your own supply.

If you read the article, please comment on any future meeting topics you spot in it. I find it a vibrant collection of concepts for further exploration.

Check out Ed Berge’s blog

We’ve come to appreciate Ed Berge’s thoughtful posts on consciousness, metaphorical thinking, etc. Check out his fun, informative blog, Proactive Progressive Propagation. (Where I work, that would definitely become ‘P3.’)

Book review – Life 3.0: Being Human in the Age of Artificial Intelligence, by Max Tegmark

Max Tegmark’s new book, Life 3.0: Being Human in the Age of Artificial Intelligence, introduces a framework for defining types of life based on the degree of design control that sensing, self-replicating entities have over their own ‘hardware’ (physical forms) and ‘software’ (“all the algorithms and knowledge that you use to process the information from your senses and decide what to do”).

It’s a relatively non-academic read and well worth the effort for anyone interested in the potential to design the next major forms of ‘Life’ to transcend many of the physical and cognitive constraints that have us now on the brink of self-destruction. Tegmark’s forecast is optimistic.

Promising metabolic therapy for dementias

An article describes a personalized therapeutic program involving 10 patients and using multiple modalities for metabolic enhancement for neurodegeneration (MEND).

The first 10 patients who have utilized this program include patients with memory loss associated with Alzheimer’s disease (AD), amnestic mild cognitive impairment (aMCI), or subjective cognitive impairment (SCI). Nine of the 10 displayed subjective or objective improvement in cognition beginning within 3-6 months, with the one failure being a patient with very late stage AD. Six of the patients had had to discontinue working or were struggling with their jobs at the time of presentation, and all were able to return to work or continue working with improved performance. Improvements have been sustained, and at this time the longest patient follow-up is two and one-half years from initial treatment, with sustained and marked improvement. These results suggest that a larger, more extensive trial of this therapeutic program is warranted. The results also suggest that, at least early in the course, cognitive decline may be driven in large part by metabolic processes. Furthermore, given the failure of monotherapeutics in AD to date, the results raise the possibility that such a therapeutic system may be useful as a platform on which drugs that would fail as monotherapeutics may succeed as key components of a therapeutic system.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4221920/

Cognitive decline as early as 18 years prior to clinical diagnosis of dementia

Performance on individual cognitive tests of episodic memory, executive function, and global cognition also significantly predicted the development of AD dementia, with associations exhibiting a similar trend over 18 years.

Conclusions: Our findings suggest that cognitive impairment may manifest in the preclinical phase of AD dementia substantially earlier than previously established.

Neuroscience of Empathy

(This is copied from the Meetup site. Thanks again to Brent for hosting.)

Details

Empathy is the ability to put yourself in another person’s shoes and understand how they feel, to be them, even for a second. It’s the link between self and others: how we connect, heal, and relate. Considering its importance in every aspect of our lives, we are taking a deeper look at the neuroscience behind empathy.

Recommended Preparation Info.

The Neuroscience of Empathy | Article | 5 minutes (https://www.psychologytoday.com/blog/the-athletes-way/201310/the-neuroscience-empathy)

The Neuroscience of Compassion | Video | 20 min (https://youtu.be/n-hKS4rucTY)

Jeremy Rifkin: The empathic civilization | Video | 10 min (https://www.ted.com/talks/jeremy_rifkin_on_the_empathic_civilization)

A CALM LOOK AT THE MOST HYPED CONCEPT IN NEUROSCIENCE – MIRROR NEURONS | Article | 5 min (https://www.wired.com/2013/12/a-calm-look-at-the-most-hyped-concept-in-neuroscience-mirror-neurons/)

Empathy for others’ pain rooted in cognition rather than sensation | Article | 5 min (https://www.sciencedaily.com/releases/2016/06/160614100237.htm)

Thomas Lewis: “The Neuroscience of Empathy” | Video | 60 min (https://youtu.be/1-T2GsG0l1E)

Suggested Additional Info.

Feeling Others’ Pain: Transforming Empathy into Compassion | Article | 5 min (https://www.cogneurosociety.org/empathy_pain/)

Structural basis of empathy and the domain general region in the anterior insular cortex | Study | 20 min (http://journal.frontiersin.org/article/10.3389/fnhum.2013.00177/full)

Neurobiology of Empathy and Callousness: Implications for the Development of Antisocial Behavior | Study | 20 min (https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2729461/)

The Science Behind Empathy and Empaths | Article | 5 min (https://www.psychologytoday.com/blog/the-empaths-survival-guide/201703/the-science-behind-empathy-and-empaths)

Study challenges perception that empathy erodes during medical school | Article | 5 min (https://www.sciencedaily.com/releases/2017/09/170909194039.htm)

Comments

  • Mark Harris

    Rifkin’s book, The Empathic Civilization, is excellent.

    29 days ago
  • John

    Here is a link to an excellent article arguing against a myopic focus on empathy.
    http://bostonreview.net/forum/paul-bloom-against-empathy

    23 days ago
  • John

    Here is a link to a free ebook that is entitled Compassion: Bridging Science and Practice. The book is the culmination of research findings in social neuroscience studies conducted by Tania Singer and others. There are multiple formats for download.
    http://www.compassion-training.org/?page=download&lang=en

    23 days ago
  • John

    Here is a link to an article about Tania Singer’s research in Science Magazine.
    http://flourishfoundation.org/wp-content/uploads/2014/04/Compassioan-Science-2013.pdf

    23 days ago
  • Edward

    From the link: “Patterns associated with empathic care, for instance, overlapped with systems in the brain associated with value and reward, such as the ventromedial prefrontal cortex and the medial orbitofrontal cortex. In contrast, patterns of empathic distress overlapped with systems in the brain known for mirroring, such as the premotor cortex and the primary and secondary somatosensory cortices, which help an individual simulate or imagine what another person is feeling or thinking.”

    23 days ago
  • Edward

    Here’s another one I just read: “Brain imaging reveals neural roots of caring.” http://neurosciencenews.com/caring-neural-roots-6870/

    23 days ago
  • Edward

    From the conclusion: “Shared representations of affective states are activated from the top down in more cognitive forms of empathy, which recruit additional executive and visuospatial processes. However, the literature overestimates distinctions between emotional and cognitive empathy, following traditional practices to dichotomize in science and philosophy. Despite each having unique features, affective and cognitive empathy both require access to the shared representations of emotion that provide simulations with content and an embodied meaning.”

    23 days ago
  • Edward

    The entire article can be read here: https://sci-hub.cc/10.1038/nrn.2017.72

    23 days ago
  • Edward

    And this article. Abstract: “Recent research on empathy in humans and other mammals seeks to dissociate emotional and cognitive empathy. These forms, however, remain interconnected in evolution, across species and at the level of neural mechanisms. New data have facilitated the development of empathy models such as the perception–action model (PAM) and mirror-neuron theories. According to the PAM, the emotional states of others are understood through personal, embodied representations that allow empathy and accuracy to increase based on the observer’s past experiences. In this Review, we discuss the latest evidence from studies carried out across a wide range of species, including studies on yawn contagion, consolation, aid-giving and contagious physiological affect, and we summarize neuroscientific data on representations related to another’s state.” https://www.nature.com/nrn/journal/v18/n8/full/nrn.2017.72.html

    23 days ago
  • John

    Here is a link to an excellent video of 4 researchers giving talks at the Stanford CCARE conference. The video is 75 minutes.
    CCARE Science of Compassion 2014: Introduction to the Science of Empathy, Altruism, and Compassion
    https://youtu.be/YFDiQNwqbfw

    22 days ago
  • Edward

    Jimmy Kimmel in this video highlights a lot of what we talked about tonight. Yes, we need to feel empathy for those killed and injured in the Las Vegas shooting, but we also need to DO something about it. Meaning gun legislation. He highlights those in Congress who are making it easier instead of harder to obtain the kind of automatic weapons used in this mass murder. The reality is we must make such guns illegal, for doing so acts on our empathy and morality in a way that protects and serves us. https://www.youtube.com/watch?v=ruYeBXudsds

    21 days ago

State of AI progress

An MIT Technology Review article introduces the man responsible for the 30-year-old deep learning approach, explains what deep learning is, and questions whether deep learning may be the last significant innovation in the AI field. The article also touches on a potential way forward for developing AIs whose functioning is more analogous to the human brain’s.

Computer metaphor not accurate for brain’s embodied cognition

It’s common for brain functions to be described in terms of digital computing, but this metaphor does not hold up in brain research. Unlike computers, in which hardware and software are separate, organic brains’ structures embody memories and brain functions. Form and function are entangled.

Rather than finding that brains work like computers, we are beginning to design computers (artificial intelligence systems) to work more like brains.

https://www.wired.com/story/tech-metaphors-are-holding-back-brain-research/ 

Should AI agents’ voice interactions be more like our own? What effects should we anticipate?

An article at Wired.com considers the pros and cons of making the voice interactions of AI assistants more humanlike.

The assumption that more human-like speech from AIs is naturally better may prove as incorrect as the belief that the desktop metaphor was the best way to make humans more proficient in using computers. When designing the interfaces between humans and machines, should we minimize the demands placed on users to learn more about the system they’re interacting with? That seems to have been Alan Kay’s assumption when he helped design the first desktop interfaces at Xerox PARC in the early 1970s.

Problems arise when the interaction metaphor diverges too far from the reality of how the underlying system is organized and works. In a personal example, someone dear to me grew up helping her mother–an office manager for several businesses. Dear one was thoroughly familiar with physical desktops, paper documents and forms, file folders, and filing cabinets. As I explained how to create, save, and retrieve information on a 1990 Mac, she quickly overcame her initial fear. “Oh, it’s just like in the real world!” (Chalk one for Alan Kay? Not so fast.) I knew better than to tell her the truth at that point. Dear one’s Mac honeymoon crashed a few days later when, to her horror and confusion, she discovered a file cabinet inside a folder. A few years later, there was another metaphor collapse when she clicked on a string of underlined text in a document and was forcibly and instantly transported to a strange destination.

Having come to terms with computers through the command-line interface, I found the desktop metaphor annoying and unnecessary. Hyperlinking, however, is another matter altogether: an innovation that multiplied the value I found in computing.

On the other end of the complexity spectrum would be machine-level code. There would be no general computing today if we all had to speak to computers in their own fundamental language of ones and zeros. That hasn’t stopped some hard-core computer geeks from advocating extreme positions on appropriate interaction modes, as reflected in this quote from a 1984 edition of InfoWorld:

“There isn’t any software! Only different internal states of hardware. It’s all hardware! It’s a shame programmers don’t grok that better.”

Interaction designers operate on the metaphor end of the spectrum by necessity. The human brain organizes concepts by semantic association. But sometimes a different metaphor makes all the difference. And sometimes, to be truly proficient when interacting with automation systems, we have to invest the effort to understand less simplistic metaphors.

The article referenced at the beginning of this post mentions that humans are manually coding "speech synthesis markup tags" to make the synthesized voices of AI systems sound more natural. (Note that this creates the appearance that the AI understands the user's intent and emotional state, though that more natural intelligence is illusory.) Intuitively, this sounds appropriate. The downside, as the article points out, is that colloquial AI speech limits human-machine interactions to the sort of vagueness inherent in informal speech. It also trains humans to be less articulate. The result may be interactions that fail to clearly communicate what either party actually means.
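To make that concrete, here is a minimal, hypothetical sketch (in Python, purely for illustration) of the kind of hand-authored markup the article alludes to. The tag names follow the W3C SSML convention (speak, break, prosody, emphasis); which tags and values a given assistant platform actually accepts varies by vendor, so treat this as a sketch rather than a recipe.

    # Illustrative only: a hand-authored SSML snippet of the sort the article describes.
    # <break>, <prosody>, and <emphasis> are standard W3C SSML elements; exact support
    # and accepted values differ across text-to-speech vendors.
    ssml = """
    <speak>
      Sure, I can help with that.
      <break time="300ms"/>
      <prosody rate="95%" pitch="-2st">Give me just a moment</prosody>
      <emphasis level="moderate">while I check.</emphasis>
    </speak>
    """.strip()

    # A text-to-speech engine that accepts SSML would render the pause, the slightly
    # slower and lower-pitched aside, and the emphasized phrase, making the voice
    # sound less robotic than a flat reading of the same words.
    print(ssml)

The point of such markup is to layer the pauses, pitch shifts, and emphasis of natural speech onto text the AI was going to say anyway; the added warmth is authored by a person, not understood by the machine.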

I suspect a colloquial mode could be more effective in certain kinds of interactions: when attempting to deceive a human into thinking she’s speaking with another human; virtual talk therapy; when translating from one language to another in situations where idioms, inflections, pauses, tonality, and other linguistic nuances affect meaning and emotion; etc.

In conclusion, operating systems, applications, and AIs are not humans. To improve our effectiveness in using more complex automation systems, we will have to meet them farther along the complexity continuum–still far from machine code, but at points of complexity that require much more of us as users.