At CEATEC 2023, NEC will showcase the internal use of its large-scale natural language model, the "NEC Generative AI Framework". The model runs on a supercomputer equipped with 928 GPUs delivering 580 PFLOPS of computing power, and NEC claims it offers strong Japanese-language understanding and response capabilities. It has approximately 13 billion parameters.
Compared with OpenAI's GPT-3, which has approximately 175 billion parameters, and Meta's Llama 2, which has up to approximately 70 billion, NEC's model is smaller in scale, and it typically takes around 10 seconds to return an answer. However, NEC believes that a large-scale natural language model of this size is relatively easy to operate and fine-tune, while striking a balance between accuracy and cost-effectiveness in operation.
Like other large-scale natural language models, NEC emphasizes, the "NEC Generative AI Framework" can be deployed on-device, in NEC's own data centers, or in the Microsoft Azure public cloud.
Compared with the Azure OpenAI Service, which already offers the GPT-4 large-scale natural language model, NEC explains that the "NEC Generative AI Framework" provides greater deployment flexibility and a better understanding of Japanese content, and lets users adapt the model to their own data-handling requirements rather than conforming to public-cloud AI services.
NEC has already provided the model to around 10 companies and plans to work with more manufacturing and financial firms. It is also preparing to expand further within the Japanese market, though it has not specified concrete development targets.




