$1.3 billion in GPU computing power to create the most powerful personal assistant


Shin Ji Won reports

Editor: Run

[New Wisdom Yuan Guide] The founder of Inflection AI says AI is expected to become a killer tool for solving human psychological problems. Their first-generation product already lets users feel a sun-like warmth.

Suleyman, co-founder of DeepMind and founder of Inflection AI, argues in his new book "The Coming Wave" that AI will help keep people free of psychological problems in the future.

He further explained: "I don't think we really recognize the impact of the family. Because whether you're rich or poor, no matter what ethnic background you're from, no matter what your gender is, a family that's kind and supportive of you is a huge motivator."

"I think we're in a new phase of AI development, and we have the means to provide support, encouragement, affirmation, guidance and advice [for everyone]. We distilled emotional intelligence. I think it's going to unleash the creativity of millions of people who didn't have access to that kind of opportunity before."

Suleyman's view may stem from his own experience:

He was born in north London in 1984 to a Syrian father and an English mother. He grew up in poverty, and when he was 16, his parents separated and both emigrated abroad, leaving him and his younger brother to fend for themselves.

He was later admitted to Oxford University to study philosophy and theology, but dropped out a year later.

"When I was at Oxford, I ran a juice and milkshake stand in Camden Town. Because I was penniless, I had to spend the whole summer earning money. At the same time, I was also doing charity work."

He said the charity work was helping a friend set up a Muslim youth helpline, offering counseling and psychological support tailored to young Muslims.

The 39-year-old still has no contact with his father and lives alone in California. Asked what he hopes AI can provide, he replied:

"Raise the ceiling on your abilities, and change how you see and evaluate yourself."

Suleyman's statement is not mere utopian talk: he founded Inflection AI to develop an all-round personal assistant meant to handle almost every problem a person may encounter in life.

Inflection AI raised more than $1.3 billion in August at a valuation above $4 billion, in a round led by Google, Nvidia, Bill Gates, and others.

AI perceives emotions more keenly than people and could serve as an emotional-therapy tool

At the same time, research by psychologists also supports Suleyman's claim:

The test scores the empathy humans show in different scenarios: subjects were given detailed descriptions of 20 emotional situations, such as funerals, career successes, or insults, and asked to describe the emotions they might feel in each.

The more detailed and comprehensible the emotional description, the higher the score on the Levels of Emotional Awareness Scale (LEAS).

The researchers assessed ChatGPT's responses using the same criteria applied to human responses, and compared the results with a previous study conducted in France on people aged 17 to 84 (n = 750).

Across the two test runs, ChatGPT scored a high 85 and 98, while human performance was left far behind: 56 points for men and 59 for women, not even a passing grade.

Another study, published in April in JAMA Internal Medicine, showed that ChatGPT outperformed doctors in quality and empathy when answering online questions.


The study compared ChatGPT's answers with physicians' answers to patient questions posted on Reddit's r/AskDocs forum.

The cross-sectional study, which covered 195 randomly selected questions, found that the chatbot's responses were preferred over the doctors' answers, with ChatGPT rated higher on both quality and empathy.

AI assistants could help draft responses to patient questions, which could benefit both clinicians and patients, the researchers wrote.

The technology needs further exploration in clinical settings, including having chatbots draft responses that physicians then edit.

Randomized trials could evaluate whether AI assistants improve responses, reduce clinician burnout, and improve patient outcomes.

Admittedly, both studies are imperfect, and the researchers do not expect AI chatbots to replace psychologists in seeing patients directly.

But both findings point to the same fact: AI chatbots can give humans help with mental health that was previously hard to come by.

It could be said that large language models seem inherently better suited to emotional understanding and communication than to other productivity-enhancing applications. After all, humans convey feelings to one another, and language is the most important carrier.

Can Inflection AI's products solve psychological problems?

Since Suleyman is so bullish on AI's potential to solve psychological problems, how does his own product perform in this regard?

Pi, the personal assistant launched by the Inflection AI he founded, has been live for a few months now. Let's try out how well it serves users as a personal assistant.

Going straight to the web version: although the product has been live for a while, the page is still very minimal.

According to what Suleyman revealed in the interview, the current version of the chatbot is still an early one, and the large batch of GPUs purchased after the funding round is working overtime to train their latest models.

On Pi's chat page, click the four-square grid icon in the lower left corner to see several common scenarios prepared for users.

Each scenario is effectively a customized instruction: selecting one automatically sets up a working context for the chatbot.

The chatbot also gives the user an opening prompt for each scenario, for example showing how to start the chat after I select "Motivate myself".
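Mechanically, a scenario preset like this usually amounts to prepending a fixed instruction to the conversation before the user's first message. As an illustration only (Inflection has not published how Pi implements its presets; the `SCENARIOS` table, prompt text, and `start_scenario` helper below are all hypothetical), such a preset could be sketched as:

```python
# Hypothetical sketch of scenario presets as system instructions.
# Scenario names and prompt text are invented for illustration;
# they are not Pi's actual internals.

SCENARIOS = {
    "motivate_me": (
        "You are a supportive personal assistant. Encourage the user, "
        "acknowledge their feelings, and suggest one small next step."
    ),
    "vent": (
        "You are a patient listener. Let the user vent, validate their "
        "emotions, and avoid giving unsolicited advice."
    ),
}

def start_scenario(name: str, opening_line: str) -> list[dict]:
    """Build the initial message list for a chat model, installing the
    scenario's instruction as the system message and the scenario's
    opening prompt as the assistant's first turn."""
    return [
        {"role": "system", "content": SCENARIOS[name]},
        {"role": "assistant", "content": opening_line},
    ]

messages = start_scenario("motivate_me",
                          "What would you like to work on today?")
print(messages[0]["role"])  # the scenario instruction rides in the system slot
```

Under this reading, picking a tile in the grid simply swaps which system instruction the model sees, which is why each scenario feels like a different "working environment" for the same underlying chatbot.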

First, the editor tried Pi's Chinese ability, setting up a scenario in Chinese and asking it for comfort.

Pi seems to understand Chinese, but its Chinese output is not fluent, and its replies got cut off midway through the conversation every time.

The editor then tried English, and the results were much better.

It understood what I was saying very well and gave some advice that I personally found useful for improving my mental state.

I wanted to turn up the pressure and see whether it could still make me feel better if I were genuinely not doing well.

Sure enough, it gave some heartwarming comfort and advice.

So I kept pouring out my woes to see how it would comfort me.

It really does have a knack for comforting people: even though the scenario was hypothetical, I could feel that if I were actually in the situation described, I would still be comforted.

Following up on the advice it gave me, I continued asking it for help.

Despite my seemingly unreasonable requests, it patiently helped me work through the problem, in both tone and content.

Beyond emotional comfort, I asked it how to run a small side business to get through a financial emergency, and the advice it gave was quite reliable.

Taking on some training work, or using one's professional skills for part-time jobs, is a very practical way for a newly unemployed worker to earn money.

It could be said that offering reliable advice and delivering heartwarming words are equally important for someone trying to get out of a psychological rut, and Pi currently does well on both counts.

We also hope that Inflection AI will use the computing power at its disposal to launch even better personal assistant products, especially by solving multilingual capability, so that people everywhere can access psychological support equally and move toward an era free of psychological problems.