After unveiling the Llama artificial intelligence model earlier this year, Meta has announced Llama 2, along with partnerships with Microsoft and Qualcomm to bring the model to cloud services and mobile products respectively.
Llama 2 was trained on 2 trillion tokens, more than double the training data of Llama, and doubles the supported context length.
It retains Llama's open-source architecture and comes in three sizes, with 7 billion, 13 billion, and 70 billion parameters, preserving deployment flexibility. The collaboration with Microsoft, for example, covers not only the Azure cloud platform but also Qualcomm processor products, allowing phones and PCs with Qualcomm chips to tap greater on-device artificial intelligence computing power through Llama 2.
On the Microsoft side, Llama 2 will be integrated into the Azure cloud service and will also be available on Windows. Microsoft has previously partnered with OpenAI on GPT-3, GPT-4, and ChatGPT, which means it will retain greater flexibility in its future artificial intelligence offerings.
As for the Qualcomm partnership, phones and PCs with Qualcomm processors launching in 2024 are expected to support Llama 2, meaning Qualcomm's next-generation Snapdragon processor, expected to be unveiled in October this year, will integrate Llama 2 directly to drive on-device artificial intelligence computing.


