Towards Ambient Computing
History of Eidetic Memory Devices
In 1945, the American scientist Vannevar Bush described a concept called the Memex in his essay As We May Think. The central idea was to store an individual's conversations, records, and communications in compressed, mechanical form and retrieve them quickly, using the Memex to extend and enhance memory. Bush is often credited with the first vision of the Internet, since a web of closely connected, augmented memories can aid people's thinking. It is worth noting how revolutionary the idea was: it predates even the arrival of ENIAC.
The renowned science fiction writer Ted Chiang imagined a similar memory aid in his 2013 story The Truth of Fact, the Truth of Feeling. One storyline follows a journalist who uses a device called Remem to record and instantly recall every aspect of their life. In a parallel storyline, the arrival of written language upends a society that previously relied on oral history. If a personal, permanent memory device like Remem becomes a reality, it could have consequences comparable to the invention of the internet or of writing itself.
And it's becoming a reality.
Prehistoric Lifelogging
In 1994, the inventor Steve Mann began live-streaming what he saw through a wearable camera and display, an experiment that by 1998 had given birth to the genre of lifelogging. As technology advanced, the devices became ever more compact; eventually smartphones arrived and rendered most lifelogging hardware obsolete. The original objective was to livestream events to others in real time rather than to enhance one's own memory. Smartphones were ideal for broadcasting everything to an audience, so many lifeloggers switched to them, giving rise to a new profession: streamers.
Then Lifelogging Turned into Remembering More
With smartphones dominating broadcasting, lifelogging tools shrank further and began to be built as memory aids. A prime example is Memoto's Narrative Clip from 2013, which captures a photo every 30 seconds to document an entire day.
The downside of a camera that shoots automatically is that it produces thousands of photos a day, burying the important moments in data. In 2017, Google announced a more advanced take, Google Clips: a camera that uses artificial intelligence to decide when a moment is worth capturing and takes a photo accordingly. Google Clips launched quietly, received little attention, and faded away, joining hundreds of other Killed-by-Google experiments.
The Emergence of LLM Lifelogging
Until recently, lifelogging devices existed mainly as tiny cameras attached to the body, largely because there was no good way to summarize hours of recordings. And then, as we all know, ChatGPT came along.
The first meaningful attempt at an LLM lifelogging device was Robert Dam's Wisper in November 2022. He paired an LLM with recordings of himself captured through a phone app. Dam saw two significant advantages: a perfect memory, and a personal psychologist that could analyze his conversations to monitor his emotional state. But the limitations of a cell phone were evident. The software would have to run constantly, hurting performance and battery life. The only feasible options were to build a separate device or to integrate the software directly into the smartphone's operating system.
Road towards the Next Apple
A new form factor brings a tectonic shift in culture. Desktops made it customary to face a wall in the corner of a room. Smartphones made it normal to handle financial transactions while walking down the street. AirPods made it natural to speak aloud as if talking to yourself. Changing the form factor of our gadgets can cause cultural disruptions and create powerful new companies. Right now, we are on the brink of another significant form factor shift: the pin.
Soon, we might wear a tiny pin around our necks, resembling an iPod shuffle, with artificial intelligence continuously running in the background. Depending on the device, it could even project images onto our palms. Lacking a display, it could run for several days on a charge. It could also record and store all our daily activities and deliver a summary back to us.
It may seem like a scene from science fiction, but these products are already in development and slated for release in the next year or so. Let's take a closer look at some examples.
In October 2023, Avi Schiffmann unveiled the device he'd been working on, the Tab.
In response, the Rewind team announced the Rewind Pendant.
Both pendants use OpenAI's Whisper and GPT models to record conversations and respond to queries on the spot. Both announcements came within weeks of each other in early October, and if past performance is any guide, the competition will only intensify.
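To make that pipeline concrete, here is a minimal sketch of the loop such a pendant might run, assuming the official OpenAI Python client with an API key in the environment and a pre-recorded audio file standing in for the microphone; the model names, file names, and prompt are illustrative, not a description of either product's actual implementation.

```python
# Sketch of a pendant-style memory loop: transcribe captured audio with Whisper,
# keep a running transcript, and answer questions grounded in that transcript.
# Assumes the `openai` Python package and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

def transcribe(audio_path: str) -> str:
    """Turn a captured audio clip into text with Whisper."""
    with open(audio_path, "rb") as audio_file:
        result = client.audio.transcriptions.create(model="whisper-1", file=audio_file)
    return result.text

def ask_memory(transcript: str, question: str) -> str:
    """Answer a question using only the day's transcript as context."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative choice of model
        messages=[
            {"role": "system",
             "content": "You are a personal memory assistant. Answer only from the transcript provided."},
            {"role": "user",
             "content": f"Transcript of my day:\n{transcript}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    transcript = transcribe("recording.wav")  # stands in for continuous capture
    print(ask_memory(transcript, "What did I promise to send after the morning meeting?"))
```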
Ambient Computing
And so the concept of ambient computing was born. Humane is a company founded by two former Apple directors of software and design. Their aim is to build computing that is readily accessible but never intrusive. They have conceived a wearable device, resembling a pin, that projects information onto your hand with a sophisticated laser projector: in essence a lifelogging device, but with far more advanced capabilities.
Their argument is that such technology can always be available to assist you and remember everything, without the digital addiction that modern smartphones so often cause. They believe a healthy relationship with our electronics is possible.
Us in the Ambient Computing Era
In the aforementioned story The Truth of Fact, the Truth of Feeling, Ted Chiang writes:
Psychologists make a distinction between semantic memory—knowledge of general facts—and episodic memory, or recollection of personal experiences. We've been using technological supplements for semantic memory ever since the invention of writing: first books, then search engines. By contrast, we've historically resisted such aids when it comes to episodic memory; few people have ever kept as many diaries or photo albums as they did ordinary books... We regarded our episodic memories as such an integral part of our identities that we were reluctant to externalize them, to relegate them to books on a shelf or files on a computer.
Part of me wanted to stop this, to protect children's ability to see the beginning of their lives filtered through gauze, to keep those origin stories from being replaced by cold, desaturated video. But maybe they will feel just as warmly about their lossless digital memories as I do about my imperfect, organic memories.
I'm a product of my time, and times change. We can't prevent the adoption of digital memory any more than oral cultures could stop the arrival of literacy, so the best I can do is look for something positive in it. And I think I've found the real benefit of digital memory. The point is not to prove you were right; the point is to admit you were wrong... What digital memory will do is change those stories from fabulations that emphasize our best acts and elide our worst, into ones that—I hope—acknowledge our fallibility and make us less judgmental about the fallibility of others.
I foresee a future where technology is readily available to assist humans, but not so pervasive that it distracts from human intelligence. Technology should be used as a tool, not as a hindrance. It should not overgrow like a cancerous cell, draining us of energy and focus. Instead, we should use it to our advantage while maintaining our ability to think and function independently.
Technology is not neutral in its values or its impact. Some argue that responsibility lies solely with the person using a technology, but this ignores the fact that technologies are created to meet specific demands and motivations, and those who demand them are not neutral: they want the technology for a reason. Fritz Haber, the inventor of nitrogen fixation, was hardly value-neutral when, driven by German nationalism, he developed chlorine gas as a weapon. Haber even studied the international law restricting poisonous gases and devised a delivery method, releasing the gas from cylinders rather than shells, that let Germany skirt the ban.
Social engineering, so prevalent in the modern era, is another area that is far from value-neutral. Social media companies such as Facebook are known to employ teams whose job is to make their products more addictive, an approach not so different from drug cartels searching for ways to make more potent drugs. Perhaps we will one day look back on this period as an era of barbaric social media manipulation, when the harvesting of human emotion went essentially unregulated.
There are similar worries about ambient computing. If we put infinitely scrolling video feeds into our glasses, or even onto our irises, will our dopamine systems be safe? I hope these new companies help us build a healthier relationship with technology, something we failed to do during the smartphone revolution. That's why I hope that ambient computing companies
- Prioritize people's peace of mind over short-term retention
- Foster human creativity over frantic data collection
- Value the flourishing of the human mind over immediate metrics
I am excited about the upcoming Ambient Computing era, and I hope these companies keep it truly ambient: readily available, but never intrusive.