In recent years, computational linguistics has seen significant advances in language models (LMs) that can process multiple languages simultaneously. This capability is crucial in today’s globalized world, where effective communication across diverse linguistic boundaries is essential. Multilingual Large Language Models (MLLMs) are at the forefront of this development, offering solutions for the complex demands of multilingual understanding and generation. The primary challenge MLLMs address is the effective processing and generation of text across many languages, including those with limited resources. Traditionally, LMs have been developed predominantly for high-resource languages such as English, which has left a