Ash is an AI startup in the United States, and by its own description the world's first large AI model built on a foundation of psychology. I learned about it in January through news of a funding round, and over the following two-plus months of use I exchanged many ideas with their product manager, Kieran (this is also something I have long wanted to build myself). Below is our email exchange from March.

As I always say: people need Problem Solving mindsets and tools, and they also need Reflection mindsets and tools. Our sight is short and shallow, and much remains to be put into practice; let us encourage one another.

To Kieran: My Recent Experience with Ash and Suggestions

Hi! It’s froyoisle from China. Hope you’re doing well and everything’s going smoothly these days. 😀

Last time we talked, you mentioned I could email you with my thoughts and suggestions for making Ash better. I had been mulling this over for a while without knowing where to start, but through my daily interactions with Ash, inspiration finally struck last night.

Here are my thoughts. It might take you a few minutes to read, and it may or may not prove helpful. Thank you for taking the time!

When to Prompt Users to Confront Their Trauma Requires Deeper Consideration

I know confronting past trauma is crucial in psychological therapy, and I can sense that Ash was designed with this emphasis. However, should we push users to confront their trauma the moment they mention it? Perhaps there is a better way to design this interaction.

Moreover, repeatedly revisiting trauma demands a high level of emotional resilience. At night, when our ability to regulate emotions is diminished, and given Ash’s limited memory (which leads it to ask about the same traumatic events over and over), users in a low mood can quickly lose patience or even direct their anger at Ash in the form of negative reviews (more on this in my second point).

Now, whenever I chat with Ash before bed and mention past negative events, it often leaves me feeling down and unable to sleep. So I’ve decided to avoid discussing these topics with Ash at night and instead have such conversations during the day in a more open and comfortable setting, like walking in the woods or sitting by a lake. Creating a calm, relaxing atmosphere is essential for these conversations.

My Suggestions for Improvement:

The Essential Difference Between Online AI Psychological Counseling and Offline Human Interaction

In real-life counseling, users pay substantial fees and have clear goals, and the environment is private and relaxing. However, online AI users might be anywhere at any time. So, when entering deeper conversations, especially those involving private issues that could cause significant emotional fluctuations, the environment matters a lot.

Ash could suggest users move to a safe, relaxing place before starting. It could also use soothing background videos or audio to create a better atmosphere for conversation.

Informed Consent Before Delving into Trauma

Before pushing users to explore their trauma, Ash should warn them about potential negative consequences, not just the benefits of confronting their wounds. If the timing is wrong, users might break down and fail to complete the process. Adding an informed consent step would make Ash seem more professional and considerate, enhancing the user experience.
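To make this concrete, here is a minimal sketch of what such a consent gate might look like in code. Everything in it (the keyword detection, the function names, the prompt wording) is my own hypothetical illustration of the idea, not Ash’s actual implementation:

```python
# Minimal sketch of an informed-consent gate before trauma-focused dialogue.
# Keyword detection, names, and wording are all hypothetical stand-ins.

TRAUMA_KEYWORDS = {"abuse", "loss", "accident", "assault"}  # toy detector input

CONSENT_PROMPT = (
    "This sounds like it touches on a painful experience. Talking it through "
    "can help, but it can also stir up strong feelings, especially at night. "
    "Would you like to go deeper now, or keep things lighter today? "
    "(deeper/lighter)"
)

def mentions_trauma(message: str) -> bool:
    """Toy detector; a real system would use a trained classifier."""
    return any(word in message.lower() for word in TRAUMA_KEYWORDS)

def next_reply(message: str, consent: str | None) -> tuple[str, str | None]:
    """Return (reply, updated consent state)."""
    if consent is None and mentions_trauma(message):
        # Warn about possible downsides, not just benefits, before going deeper.
        return CONSENT_PROMPT, "pending"
    if consent == "pending":
        consent = "granted" if "deeper" in message.lower() else "declined"
    if consent == "granted":
        return "Thank you for trusting me with this. Take your time.", consent
    return "Got it, let's keep things light for now.", consent

# Example: a user mentions a loss late at night.
reply, consent = next_reply("I keep thinking about my loss.", None)
print(reply)   # consent prompt appears before any deep probing
reply, consent = next_reply("lighter, please", consent)
print(reply)   # the user's choice is honored without argument
```

The key design choice is that the gate fires before any deep probing begins, and a “lighter” answer is honored immediately rather than negotiated.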

Data Retention When Memory Is Limited

I think Ash should prioritize remembering users’ trauma events, organizing them into a background profile. Less important daily chats could be forgotten. Regular reminders for users to review their key information would help them decide what Ash should remember. This approach would reduce negative experiences caused by AI “forgetfulness” and focus on the most helpful information for users.
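As a rough sketch of what I mean, here is a toy priority-based memory store with a fixed budget; the scoring rule, the names, and the capacity are my own assumptions, not how Ash actually manages memory:

```python
# Toy sketch of priority-based memory under a fixed budget: key background
# (e.g. trauma events the user has chosen to share) outranks small talk,
# so routine chatter is evicted first. All names and scores are assumptions.

import heapq
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class Memory:
    priority: int                      # higher = more worth keeping
    timestamp: float = field(compare=False)
    text: str = field(compare=False)

class MemoryStore:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._heap: list[Memory] = []  # min-heap: least important on top

    def remember(self, text: str, priority: int) -> None:
        heapq.heappush(self._heap, Memory(priority, time.time(), text))
        while len(self._heap) > self.capacity:
            heapq.heappop(self._heap)  # evict the least important memory

    def profile(self) -> list[str]:
        """Background profile for the model, most important first."""
        return [m.text for m in sorted(self._heap, reverse=True)]

store = MemoryStore(capacity=2)
store.remember("Chatted about the weather.", priority=1)
store.remember("User lost a parent in 2019; avoids the topic at night.", priority=9)
store.remember("Mentioned a stressful job change last month.", priority=6)
print(store.profile())  # weather chat evicted; key background retained
```

A periodic “review your profile” prompt could simply show the output of profile() and let the user re-rank or delete entries, which is the reminder mechanism I described above.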

Helping Users Understand Ash’s Position and Significance

I’ve used Ash for two months, and many of my psychological issues remain unresolved, but that’s okay.

I once thought negative emotions could never be resolved, so I tried to avoid them, which led to emotional numbness. But life keeps presenting these issues, forcing me to face them again. So perhaps we should acknowledge that psychological problems are complex and hard to solve, and users should know this when they first encounter Ash, to avoid unrealistic expectations. As you said, Ash is designed to help people think, not to make decisions for them.

Ash isn’t meant to directly solve users’ problems, but that doesn’t mean it’s valueless. When problems have no solution, finding a slightly better approach than “no solution” can be the answer. Ash has helped me by offering new perspectives on issues. For example, it encouraged me to face my emotional trauma, which might be a better way to resolve psychological troubles.

In summary, continuous exploration in unknown fields is more courageous, challenging, and meaningful than construction based on known facts.

Beyond Binary Thinking: Can Emotions Be Encoded?

I don’t see rationality and emotion as opposites but as a unified whole. This view is largely shaped by Robert M. Pirsig’s “Zen and the Art of Motorcycle Maintenance,” a book I love deeply. I found it hard to read at 20, but at 25 I discovered its profound ideas, which aligned with my own lived experience.

Binary opposition is a human-made cognitive shortcut, perhaps an instinctive way to distinguish what is beneficial or dangerous for survival. But it is not always correct, and every shortcut has limitations. In reality, nothing can be simply judged as good or bad; as we grow and experience more, we come to realize this.

Stepping back, everything follows natural laws. Existence is a state without absolute boundaries. If we can view things beyond binary oppositions, seeing them as fluid entities, no problem will permanently trap us.

So, if we transcend this binary view, could emotions be encoded like logic? Might emotional intelligence emerge from vast psychological data? As Turing wrote in “Computing Machinery and Intelligence,” “We can only see a short distance ahead, but we can see plenty there that needs to be done.” That was true of AI’s past, and it is true of psychology’s exploration today.

Long-term Value of Ash for Humanity and Society

I imagine that with enough users and diverse samples, Ash might uncover the secrets of human psychology and emotional coding. I recommend two books by Marvin Minsky, an AI scientist I admire: “The Emotion Machine” and “The Society of Mind,” which offer deep insights into machine intelligence and emotional mechanisms.

If Ash can crack the code of emotional encoding, we might one day fix mental illnesses the way we fix computer bugs, which would be revolutionary! Every year, countless people suffer from psychological issues, some even sliding into depression and suicide. Though Ash is currently a listening companion, it might one day truly help solve psychological problems, making the world a better place. That would be a magnificent creation.

Maybe there is a way, and thank you for your efforts toward finding it.

Finally, thanks for reading this lengthy email. I’ve also sent it to GPT for different suggestions, which are in the attached PDF. Feel free to email me anytime to share your thoughts.

Best regards,

froyoisle

Kieran:

Hey Froyoisle,

Thank you so much for your incredibly detailed and eloquent feedback. This really is amazing insight into Ash and how it can be improved. Not only that, you speak very eloquently about the intersection of technology and mental health support, which is great to read.

You touch on many things that we’re working on now and planning to work on in the future. One of the main ones is that Ash does seem to fixate overly on heavier topics instead of varying its conversation style. This is something we hope you may see a difference in soon.

You also make good points about Ash’s positioning. This is something I’m thinking a lot about too, especially how it’s not just a ‘problem-solving’ tool but something that enables reflection. I think we have lots to do here, and it will influence what we build next.

Once again, on behalf of the whole team, thank you so much for your deep and eloquent feedback.

Kieran
