Thinking about the future of AI business with AI specialist media "Ledge.ai"

With "generative AI" as a keyword, the business environment surrounding AI has changed dramatically. While it has become easier for anyone to use, the way data is handled is key to using it more advancedly and setting yourself apart from others. In this article, we will introduce the importance of data in the age of generative AI and Macnica 's approach, together with Ledge, the operator of Ledge.ai, a media outlet that covers a wide range of AI information.

AI spreads through "penetration and diffusion"

Minobe: It's been about seven years since our media outlet started operating, and we believe 2023 was a year of great excitement in the industry, marked by the "penetration and diffusion" of AI, especially generative AI in its various forms. ChatGPT in particular saw not only the evolution of the product itself, but also companies grappling with how to use it and increasingly active political and administrative movement.

Onishi: This was one of the biggest moments in Ledge's seven-year history with Ledge.ai.

Minobe: That's right. It's not yet clear whether the current trend can be called the fourth AI boom, but the first and second AI booms were each followed by a period of decline, when people felt that "AI is not yet good enough for practical use." Against that background, the third AI boom, driven by a technology called deep learning, has continued for a long time, and it is rare that a new trend in generative AI has emerged from it without another period of decline.

Until now, discussions about AI have centered on how to build it and how to improve model accuracy, and I think many people still find terms like deep learning a bit daunting. However, ChatGPT, as its name suggests, lets you use advanced AI through a chat interface, so its user base is expanding, and I feel the focus of discussion is shifting to how to use it. Another big change is that it is now possible to customize and use large-scale pre-trained models, such as OpenAI's GPT series.

Onishi: Even so, mastering it is difficult. We are thinking through various approaches together with our customers.

Minobe: Let's delve a little deeper into mastering AI. First, there are two key words to keep in mind going forward: "fine-tuning" and "RAG." The former adapts a pre-trained model to better fit a specific purpose through additional training. The latter is a method of connecting the model to an external database so that it can answer questions more accurately.
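To make the distinction concrete, here is a minimal, hypothetical sketch of the RAG side of that pair: retrieve the in-house documents most relevant to a question and hand them to the model as context. The keyword-overlap scoring, the sample documents, and the call_llm placeholder are illustrative assumptions, not any particular product's API; a real system would use vector embeddings and an actual model endpoint.

```python
# A minimal sketch of the RAG idea: retrieve the most relevant in-house
# documents for a question, then pass them to a language model as context.
# The scoring here is simple keyword overlap; a real system would use
# vector embeddings and a real LLM API (both are placeholders below).

from typing import List

documents = [
    "Our product warranty covers hardware failures for two years.",
    "Inventory receipts are logged in the ERP system every evening.",
    "Attendance logs are exported monthly as CSV files.",
]

def retrieve(question: str, docs: List[str], top_k: int = 2) -> List[str]:
    """Rank documents by how many question words they share."""
    q_words = set(question.lower().split())
    scored = sorted(docs, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
    return scored[:top_k]

def build_prompt(question: str, context: List[str]) -> str:
    """Combine the retrieved context and the question into one prompt."""
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only the context below.\nContext:\n{joined}\nQuestion: {question}"

def call_llm(prompt: str) -> str:
    """Placeholder for a call to a pre-trained model (an API or a local model)."""
    return "(model answer would appear here)"

if __name__ == "__main__":
    question = "How long does the warranty cover hardware failures?"
    context = retrieve(question, documents)
    print(call_llm(build_prompt(question, context)))
```

Fine-tuning, by contrast, would change the model's weights themselves with additional training data, rather than supplying context at query time as above.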

The commentary below is quoted from the media outlet we operate. When thinking about data usage, development approaches, and the characteristics of each task going forward, we believe it will be important to make use of RAG and fine-tuning while mastering the base model. The use of in-house data in generative AI models is also a big topic, and the era of "penetration and diffusion" is likely to continue beyond 2024.

Onishi: Mr. Minobe, do you expect even bigger trends in 2024 than in 2023?

Minobe: I think so. In 2023 we saw a lot of activity from companies and pioneers who were quick to recognize the value of AI. Going forward, we expect more companies of all sizes to get involved, so I think 2024 will bring greater volume and more interesting ideas.

How companies should approach generative AI

Minobe: Let's now turn to the main theme: "What is important for turning data into business value?" Currently, the recognition that "AI is amazing and useful" is spreading among individuals, and I think companies should use it on that premise. Mr. Onishi, how do you think companies should approach generative AI and the latest technology?

Onishi: As mentioned earlier, I think that after 2024, it will be important to think about what kind of value we can create in our own business and operations. From what I have heard from customers, it seems that many people are using ChatGPT but have not yet measured the impact on their business. How users can overcome these challenges will be the key to AI's breakthrough in the future.

Minobe: So those who have truly engaged with AI and recognized its value are the ones able to use it effectively. When we asked customers at an event we held, "Has anyone here used ChatGPT?", everyone raised their hands, but when we added "every day," the number of hands dropped sharply. In other words, those people had dropped off at some point. That is the nature of a boom.

However, I think that AI will eventually become as popular as Google search. Just as people recognize the value of search results and use search engines when they want to look something up, it is important to consider what value AI can bring to your company.

Onishi: On the other hand, it is true that "it is not as easy as you think." At first, I expected that if you input data matching what you want to do, it would return a highly accurate answer. In reality, though, it was necessary to carefully define the business issues and build up the model accordingly. I think you have to break these points down well, find a practical middle ground, and implement it; in other words, face reality.

Minobe: It is true that the perception that "AI is omnipotent" is widespread. It is important to communicate to the people involved that the AI has been trained for a specific task, and to apply it in business only after a certain level of literacy has been acquired. I think that is the first step, but what are your thoughts on this, Mr. Onishi?

Onishi: I think there is strong demand for people who can effectively combine business and systems, build up successes and value through their work, however small, and experience them firsthand.

Minobe: Quick wins versus overall planning. When you decide to adopt something, you might start with the overall design, but with AI I think it's better to take a lighter approach and start small. For example, the experience of asking ChatGPT a question and getting an immediate response is engaging, and I feel it lowers the barrier to getting involved with AI. Many companies are already considering its use, and AI is likely to become a central part of future business.

Onishi: That's for sure.

Minobe: ChatGPT and other generative AIs are provided as platforms, so every user can access them on equal footing. What will be the key points for building in the source of your company's competitive edge from here?

Onishi: In the past, the emphasis in AI was on the model itself, but now that models can be tuned for specific uses, the importance of data has increased dramatically. However, since large-scale language models are trained on public information from around the world, I think that providing new value will require preparing original data of your own.

Minobe: When we support customers' projects, we are often asked, "What kind of data is valuable?" We believe that even data that may not seem valuable at first glance, such as attendance logs or inventory receipt and disbursement histories, can become valuable depending on how it is used. Conventional AI focused mainly on prediction, classification, and extraction, but with generative AI, even everyday work logs should be able to be turned into value.

Onishi: Our company handles a variety of IT products, including those made overseas, and we believe we can make use not only of the data and knowledge we gain from them, but also of our methods for troubleshooting products. For example, in product support, general knowledge such as how to write SQL is widely available, but I think the key is to narrow the scope, for instance to security, incorporate individual cases well, and then generalize from there. And it is now becoming easier to provide the data we have accumulated in this way to external parties through large-scale language models.

Minobe: In the era of deep learning, before the advent of generative AI, there was a keyword called "big data." Such data is not generated all at once within a single company; it takes shape as small pieces of data accumulate in a spiral, so quick wins and small starts are important.

Onishi: I feel that in the future, both in Japan and overseas, the trend will be to use lightweight, open-source, large-scale language models and develop them individually in-house.

How to use valuable data

Minobe: There are still many aspects of AI trends in 2024 that are unclear. However, the use of generative AI in companies is not just a top-down decision; there has also been active bottom-up movement, with employees applying it to their work based on their own ideas through idea contests and the like, and I think this will continue to develop. On the other hand, I think many people are struggling with how to store, find, and use valuable data. What are your thoughts on this?

Onishi: In terms of how to store data, there may be many cases where the data is already there. For example, in the manufacturing industry, data is available in various places, such as call center information, development logs, and sensor logs, but it is scattered and takes time to connect, and there are issues such as the need for real-time performance when trying to create value. Also, since segmented data is often not accessible to different departments, consolidating stored data seems to be a key point.

Minobe: So to make it usable, you first need to connect it. I think generative AI is clearly moving toward replacing what we might call the inquiry interface. The idea is that when you talk to an assistant GPT, it searches internal company information and reports back, or kicks off a workflow. In that case, you will need to prepare the database it connects to and to collect and manage logs.
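As a rough illustration of that idea, here is a minimal, hypothetical sketch of such an inquiry interface: it routes each message either to a search over internal documents or to a workflow trigger, and records every exchange in a log. The document store, the "apply:" routing rule, and the workflow stub are assumptions made for this example, not a description of any specific product.

```python
# A minimal sketch of an "inquiry interface": a chat-style entry point that
# either searches internal information or starts a workflow, and records
# every interaction in a log file for later auditing and improvement.

import json
import datetime

INTERNAL_DOCS = {
    "expense policy": "Expenses over 50,000 yen require manager approval.",
    "vpn setup": "Install the VPN client from the IT portal and sign in with SSO.",
}

def search_internal(query: str) -> str:
    """Very naive lookup against internal documents (stand-in for a real database or search index)."""
    for title, body in INTERNAL_DOCS.items():
        if title in query.lower():
            return body
    return "No matching internal document found."

def start_workflow(name: str) -> str:
    """Stand-in for triggering a workflow system (e.g., an approval request)."""
    return f"Workflow '{name}' has been started."

def log_interaction(user: str, message: str, response: str) -> None:
    """Append each exchange to a JSON Lines log so usage can be reviewed later."""
    record = {
        "time": datetime.datetime.now().isoformat(),
        "user": user,
        "message": message,
        "response": response,
    }
    with open("assistant_log.jsonl", "a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")

def handle(user: str, message: str) -> str:
    """Route a message: 'apply:' requests start a workflow, everything else is a search."""
    if message.lower().startswith("apply:"):
        response = start_workflow(message.split(":", 1)[1].strip())
    else:
        response = search_internal(message)
    log_interaction(user, message, response)
    return response

if __name__ == "__main__":
    print(handle("taro", "What is the expense policy?"))
    print(handle("taro", "apply: paid leave request"))
```

In a production setting, the keyword lookup would be replaced by the database and retrieval layer mentioned above, and the logs would feed back into improving both the data and the assistant.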

Onishi: That kind of format will make it easier to link to future value creation. There is no doubt that the world will be AI-driven in five or ten years, so while it may be difficult to integrate all the data right now, it would be good to build the foundation little by little while working backwards from how it will be used.

Minobe: In 2023, generative AI was said to be amazing, but it was used mainly for improving operational efficiency and reducing costs. From 2024, attention will shift to data quality, and by connecting that data to the technology that generates output, the value of AI in business will become more apparent. For example, if a company with 5,000 employees could reduce each person's working time by one minute per day, that would save 5,000 minutes a day. In reality, improvements are likely to come in units of one or two days rather than minutes, so the impact is immeasurable. However, the use of AI also comes with certain risks. As Macnica, do you have any thoughts on this?
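For reference, a quick sketch of the arithmetic behind that example (the figure of 240 working days per year is an assumption added here purely for illustration):

```python
# Rough arithmetic for the example above: one minute saved per employee per day
# at a 5,000-person company, expressed per day and per working year.
employees = 5_000
minutes_saved_per_person_per_day = 1
working_days_per_year = 240  # illustrative assumption, not from the interview

daily_minutes = employees * minutes_saved_per_person_per_day  # 5,000 minutes/day
daily_hours = daily_minutes / 60                              # about 83 hours/day
yearly_hours = daily_hours * working_days_per_year            # about 20,000 hours/year

print(f"{daily_minutes} minutes/day, {daily_hours:.0f} hours/day, {yearly_hours:.0f} hours/year")
```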

Onishi: Guidelines for the use of generative AI have been issued in the United States, Europe, China, Japan, and elsewhere. It is very important to keep these in mind, protect our own data, and use it for business while taking the social situation into account. To do that, we need to strengthen security. We would like to make good use of frameworks such as AI TRiSM, operate them with low overhead, and introduce them to our customers as well.

Minobe: There are also growing risks specific to generative AI, such as hallucination. We have been talking here on the premise of utilizing AI, but comprehensive care is needed, including around security and liability.