Artificial Intelligence, Emerging Technology
October 22, 2020
AI, Neuralink, and the Evolution of Human-Machine Interfaces
By: Seth Brooks
Fast forward to more recent times, where people use more sophisticated tools in the form of computer systems. Even these modern digital tools require people to manipulate them through physical inputs (mice and keyboards) and to observe feedback through external outputs (monitors). This paradigm of direct, physical human-machine interfaces began to change over the past few decades with the rise of Artificial Intelligence (AI) capabilities.
AI has introduced an era in which computers not only respond to direct physical manipulation but can also observe and make predictions about a user’s intent. For example, computer vision systems record people’s natural kinesthetic body movements and gestures, use AI to interpret those gestures as requested actions (input), and display the results on a screen (for example, scrolling through a menu). These systems still rely on some physical input from users, although without the same direct physical connection. Digital assistants such as Apple’s Siri, Amazon’s Alexa, or Google Home similarly require a physical input, although an exceptionally light one: the breath of air behind a voice command.
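To make the gesture-to-action idea concrete, here is a minimal, hypothetical Python sketch. It is not any particular product’s pipeline: the hand-position stream is synthetic, and the names (interpret_gesture, apply_action, the threshold value) are illustrative assumptions; a real system would feed this logic from a computer-vision hand-tracking model.

```python
# Minimal sketch (not any specific product's pipeline): mapping tracked hand
# positions to a menu-scrolling action. The landmark stream here is synthetic;
# a real system would obtain it from a computer-vision hand-tracking model.

from typing import List

def interpret_gesture(y_positions: List[float], threshold: float = 0.15) -> str:
    """Classify a short window of normalized hand heights (0 = top of frame,
    1 = bottom) as a scroll gesture based on net vertical displacement."""
    displacement = y_positions[-1] - y_positions[0]
    if displacement > threshold:
        return "scroll_down"
    if displacement < -threshold:
        return "scroll_up"
    return "no_action"

def apply_action(action: str, menu_index: int, menu_length: int) -> int:
    """Update the highlighted menu item (the on-screen feedback)."""
    if action == "scroll_down":
        return min(menu_index + 1, menu_length - 1)
    if action == "scroll_up":
        return max(menu_index - 1, 0)
    return menu_index

# Synthetic frames: the hand moves downward across the camera view.
frames = [0.30, 0.38, 0.47, 0.55]
menu = ["Home", "Settings", "Profile", "Exit"]
index = apply_action(interpret_gesture(frames), 0, len(menu))
print(interpret_gesture(frames), "->", menu[index])  # scroll_down -> Settings
```

The point of the sketch is the shape of the loop: the camera observation replaces the mouse, the interpretation step replaces the click, and the screen remains the familiar output.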
On August 28, 2020, Elon Musk and the Neuralink team presented a live demonstration of Neuralink’s latest experimental technology: a device that can be implanted in the skull both to read signals from and to introduce electrical impulses into the brain’s outer cortex.1 In the presentation, Neuralink showed a video of a pig with a Neuralink implant walking on a treadmill. Predictive algorithms estimated the pig’s position and motion from its recorded brain signals with surprising accuracy. Although early in development, this technology offers a view into a possible future in which the tools of our digital world no longer require physical manipulation as an input or an externally observable output. It is an interesting world to imagine in my mind’s eye, even as I type this observation out on my physical keyboard.
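For readers curious about what such “predictive algorithms” can look like, below is an illustrative Python sketch of a simple linear decoder of the kind long used in brain-computer-interface research. It is not Neuralink’s algorithm, and every value in it is synthetic; it only shows the general idea of fitting a mapping from recorded neural activity to a position estimate.

```python
# Illustrative sketch only, not Neuralink's method: a classic linear decoder
# fit with ordinary least squares to predict a 2-D limb position from
# recorded spike counts. All data below is synthetic.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic recordings: 500 time bins of spike counts from 64 channels,
# generated with a hidden linear relationship to position plus noise.
n_bins, n_channels = 500, 64
spikes = rng.poisson(lam=3.0, size=(n_bins, n_channels)).astype(float)
true_weights = rng.normal(size=(n_channels, 2))
positions = spikes @ true_weights + rng.normal(scale=0.5, size=(n_bins, 2))

# Fit the decoder on the first 400 bins, evaluate on the remaining 100.
train, test = slice(0, 400), slice(400, 500)
weights, *_ = np.linalg.lstsq(spikes[train], positions[train], rcond=None)
predicted = spikes[test] @ weights

rmse = np.sqrt(np.mean((predicted - positions[test]) ** 2))
print(f"held-out RMSE: {rmse:.3f}")
```

Real neural decoders are far more sophisticated, but the premise is the same: the brain’s own activity becomes the input, with no keyboard, mouse, or gesture required.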
Seth Brooks is a Vice President in FCAT.
References & Disclaimers
1 See Neuralink Progress Update, Summer 2020, available at https://www.youtube.com/watch?v=DVvmgjBL74w