Since spring of this year, Facebook has been testing a new internal tool: a chatbot called Liam Bot. It is not intended for the public, but solely for employees. What exactly is its role, and why was it created at all?

In recent years, Facebook has struggled with something of an image crisis, brought on by a series of questionable practices and scandals both small and large. A few months ago, there were reports of a quiet change to the slogan on the login page – you can read about it here. In 2018, Facebook faced a lawsuit for allowing the data of as many as 87 million users to leak to Cambridge Analytica. The company has also been accused of lacking any real countermeasures against the spread of fake news.

Objections come not only from users, but also from current employees, some of whom have begun to think about changing jobs because of the accumulating negatives, and from potential candidates who, in light of the accusations against Facebook, are reluctant to submit applications. Family members and friends of current employees have distanced themselves from the company as well, while those employees must contend with all the allegations – part of their job is, after all, to look after the company's good name. It is difficult to be completely fair to one's relatives and one's employer at the same time: to neither slander the company nor alienate family and friends. Such awkward situations arise mainly during the holiday season, when people meet in larger groups and talk about what is going on in their lives.

Facebook employees reported the problem to their employer, admitting that they were unable to give balanced, diplomatic answers to relatives' questions – usually full of accusations – about the crises mentioned above. Management therefore decided to turn to artificial intelligence, which is meant to serve as a help center for employees.
Since Liam Bot was released just before American Thanksgiving, interested parties have already had a first real opportunity to test its capabilities. When an inquisitive cousin asks what Facebook is ultimately going to do about ubiquitous hate speech, the chatbot suggests an answer: "Facebook has already hired more moderators to control content on the platform more effectively." In some cases, the bot refers employees to articles on the Facebook blog, the help center, or the FAQ, where they can quickly find information on a given issue. From there the path is simple: an employee can either read up on the matter thoroughly and give a comprehensive answer to the concerned family member, or point them to the source so they can draw their own conclusions. Facebook's response thus met its employees' requests – it seems even more fully than anyone expected. Whether the chatbot actually fulfills its role we will probably not find out until after Christmas, a time particularly rich in family gatherings and heated discussions at the table.