At Meta Connect 2024, Meta announced significant AI advancements, including improved functionality for its smart glasses and AI-powered devices. These changes are intended to make Meta AI more useful to a broader audience worldwide, and Meta is opening up new ways to interact with AI by building these features into everyday products and services.

One of the headline additions is expanded multilingual support. Meta AI is now rolling out in seven more countries beyond its existing English- and Spanish-speaking markets: Argentina, Chile, Colombia, Ecuador, Mexico, and Peru in Latin America, along with Cameroon. This expansion brings a wider range of users onto the platform and extends Meta's reach across continents and cultures.
What Creative Options Does the New Meta AI Offer?
Besides language support, Meta has also introduced creative editing features. Users can now modify their images with voice commands, adding, removing, or replacing objects, elements, or even entire scenes. For instance, a user can swap a cat for a corgi with a simple request, which makes interacting with the AI more playful and enjoyable.
Furthermore, the integration of the Llama 3.1 405B model strengthens Meta AI's coding, reasoning, and instruction-following capabilities. It improves on earlier models by bringing substantially more capacity to what the assistant can accomplish, making it a strong tool for developers working on complex projects.
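For developers curious what working with the 405B model looks like in practice, here is a minimal sketch of querying Llama 3.1 405B through an OpenAI-compatible inference provider. The endpoint URL, API key variable, and exact model identifier are illustrative assumptions and will vary by hosting provider; they are not official Meta AI interfaces.

```python
# Minimal sketch: sending a coding task to a hosted Llama 3.1 405B model.
# The base_url, API key env var, and model ID below are placeholders; the 405B
# model is typically accessed through a cloud inference provider rather than
# run locally.
import os
from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://example-provider.com/v1",     # hypothetical provider endpoint
    api_key=os.environ["LLAMA_PROVIDER_API_KEY"],   # hypothetical credential
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3.1-405B-Instruct",     # model ID varies by provider
    messages=[
        {"role": "system", "content": "You are a careful coding assistant."},
        {"role": "user", "content": "Write a Python function that merges two sorted lists."},
    ],
    temperature=0.2,  # lower temperature for more deterministic code output
)

print(response.choices[0].message.content)
```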
Additionally, Meta is broadening device compatibility, allowing users to control Meta AI with voice commands through Ray-Ban Meta smart glasses. These include features such as real-time translation and reminders, with more planned, such as real-time AI video processing, making the interactive experience smoother and more immersive.