newsji.com > Science & Technology

TII unveils Falcon Arabic, the first Arabic AI model in the Falcon series, and Falcon-H1, a high-performance model redefining AI efficiency

Falcon Arabic: In addition to English and European-origin languages, Falcon now integrates Arabic - expanding its reach across the Arabic-speaking world as the region’s best-performing Arabic AI model
Published: 2025-06-07

ABU DHABI, UNITED ARAB EMIRATES -- The UAE’s Technology Innovation Institute (TII), the applied research arm of Abu Dhabi’s Advanced Technology Research Council (ATRC), unveiled two major AI advancements: Falcon Arabic, the first-ever Arabic language model in the Falcon series - now the best-performing Arabic AI model in the region - and Falcon-H1, a model that redefines performance and portability through a novel architectural design. In the small-to-medium category of AI models (30 to 70 billion parameters), Falcon-H1 outperforms comparable offerings from Meta’s Llama and Alibaba’s Qwen, enabling real-world AI on everyday devices and in resource-limited settings. The announcement was made during a keynote address by H.E. Faisal Al Bannai, Advisor to the UAE President and Secretary General of ATRC, at the Make it in the Emirates event.

Built on Falcon 3-7B, a 7-billion-parameter base model, Falcon Arabic is one of the most advanced Arabic AI models developed to date. Trained on a high-quality native (non-translated) Arabic dataset spanning Modern Standard Arabic and regional dialects, it captures the full linguistic diversity of the Arab world. According to the Open Arabic LLM Leaderboard benchmarks, Falcon Arabic outperforms all other regionally available Arabic language models, reinforcing its leadership in sovereign, multilingual AI. It ranks as the best-performing Arabic model in its class, matching the performance of models up to 10 times its size, proving that smart architecture can outperform sheer scale.

Separately, the newly launched Falcon-H1 model is designed to dramatically expand access to high-performance AI by reducing the computing power and technical expertise traditionally required to run advanced systems. The announcement builds on the success of TII’s Falcon 3 series, which ranked among the top global AI models capable of operating on a single graphics processing unit (GPU), a major breakthrough that enabled developers, startups, and institutions without high-end infrastructure to deploy cutting-edge AI, affordably.
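To see why single-GPU operation is the threshold that matters, a back-of-the-envelope estimate of weights-only memory is useful. The helper below is a hypothetical illustration (not TII code); real deployments need additional headroom for activations, the KV cache, and framework buffers.

```python
# Rough weights-only VRAM estimate for serving a model on one GPU.
# Hypothetical helper for illustration; actual serving needs extra
# memory for activations, KV cache, and runtime overhead.

def weight_memory_gib(n_params: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the weights, in GiB."""
    return n_params * bytes_per_param / 1024**3

# A 7-billion-parameter model such as Falcon 3-7B:
fp16 = weight_memory_gib(7e9, 2)    # 16-bit weights: ~13 GiB
int4 = weight_memory_gib(7e9, 0.5)  # 4-bit quantized: ~3.3 GiB

print(f"fp16: {fp16:.1f} GiB, int4: {int4:.1f} GiB")
```

At 16-bit precision a 7B model's weights fit comfortably on a single 24 GiB consumer GPU, which is what makes the "no high-end infrastructure" claim plausible.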

“We’re proud to finally bring Arabic to Falcon, and prouder still that the best-performing large language model in the Arab world was built in the UAE,” said H.E. Faisal Al Bannai at the Make it in the Emirates event in Abu Dhabi. Commenting on Falcon-H1, he said: “Today, AI leadership is not about scale for the sake of scale. It is about making powerful tools useful, usable, and universal. Falcon-H1 reflects our commitment to delivering AI that works for everyone - not just the few.”

Falcon-H1 continues to support European-origin languages and, for the first time, can scale to support over 100 languages, thanks to a multilingual tokenizer trained on diverse datasets.
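One reason a single tokenizer can cover so many scripts is byte-level fallback, a common technique in multilingual tokenizers (the sketch below is a toy illustration, not Falcon's actual tokenizer): any Unicode text round-trips through UTF-8 bytes, so no language is ever out of vocabulary.

```python
# Toy byte-level "tokenizer" (illustrative only, not Falcon's):
# every Unicode string maps to UTF-8 bytes and back losslessly,
# so a byte fallback guarantees coverage of any script.

def encode(text: str) -> list[int]:
    return list(text.encode("utf-8"))   # one token id per byte (0-255)

def decode(ids: list[int]) -> str:
    return bytes(ids).decode("utf-8")

for sample in ["hello", "مرحبا", "안녕하세요"]:
    assert decode(encode(sample)) == sample
print("all scripts round-trip")
```

Production tokenizers merge frequent byte sequences into larger units for efficiency, but the byte fallback is what makes "over 100 languages" achievable without a per-language vocabulary.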

Smarter, Simpler, and More Inclusive

Falcon-H1 was developed to meet the growing global demand for efficient, flexible, and easy-to-use AI systems. Falcon-H1, named ‘H’ for its hybrid architecture combining the strengths of Transformers and Mamba, enables significantly faster inference speeds and lower memory consumption, while maintaining high performance across a range of benchmarks.
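The efficiency claim can be illustrated with a toy recurrence. The sketch below is illustrative only, not Falcon-H1's actual layer: it shows the core property of Mamba-style state-space layers, where each token updates a fixed-size state in constant time, so a length-n sequence costs O(n), versus self-attention's O(n²) pairwise token scores.

```python
# Minimal 1-D diagonal state-space recurrence (illustrative, not
# Falcon-H1's implementation): h_t = a*h_{t-1} + b*x_t, y_t = c*h_t.
# The state h has fixed size, so per-token cost and memory are O(1),
# unlike attention's KV cache, which grows with sequence length.

def ssm_scan(inputs, a=0.9, b=0.5, c=1.0):
    h, outputs = 0.0, []
    for x in inputs:
        h = a * h + b * x        # constant-size state update per token
        outputs.append(c * h)
    return outputs

print(ssm_scan([1.0, 0.0, 0.0]))  # impulse response decays geometrically
```

A hybrid stack interleaves layers like this with Transformer attention, trading some of attention's exact token-to-token lookup for linear-time scanning, which is where the faster inference and lower memory use come from.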

“We approached Falcon-H1 not just as a research milestone but as an engineering challenge: how to deliver exceptional efficiency without compromise,” said Dr. Najwa Aaraj, CEO of TII. “This model reflects our commitment to building technically rigorous systems with real-world utility. Falcon isn’t just a model; it’s a foundation that empowers researchers, developers, and innovators, especially in environments where resources are limited but ambitions are not.”

The Falcon-H1 family includes models of various sizes: 34B, 7B, 3B, 1.5B, 1.5B-deep, and 500M parameters. These models span a wide range of performance-to-efficiency trade-offs, allowing developers to choose the most appropriate model for their deployment scenario. While the smaller models enable deployment on constrained edge devices, the flagship 34B model outperforms comparably sized models from Meta’s Llama and Alibaba’s Qwen on complex tasks.

“The Falcon-H1 series demonstrates how new architectures can unlock new opportunities in AI training while showcasing the potential of ultra-compact models,” said Dr. Hakim Hacid, Chief Researcher at the AI and Digital Science Research Center at TII. “This fundamentally shifts what’s possible at the smallest scale, enabling powerful AI on edge devices where privacy, efficiency, and low latency are critical. Our focus has been on reducing complexity without compromising capability.”

Each model in the Falcon-H1 family surpasses other models that are twice its size, setting a new standard for performance-to-efficiency ratios. The models additionally excel in mathematics, reasoning, coding, long-context understanding, and multilingual tasks.

International Impact

Falcon models are already powering real-world applications. In partnership with the Bill & Melinda Gates Foundation, Falcon has supported the development of AgriLLM, a solution that helps farmers make smarter decisions under challenging climate conditions. TII’s Falcon ecosystem has been downloaded over 55 million times globally and is widely regarded as the most powerful and consistently high-performing family of open AI models to emerge from the Middle East region.

While many AI models focus on narrow consumer use cases, TII has prioritized building foundational models that can be adapted to meet the demanding needs of industry, research, and public good, without compromising on accessibility. These models are designed to be applied across a variety of real-world scenarios, remaining accessible, resource-efficient, and adaptable to different environments.

All Falcon models are open source and available on Hugging Face and FalconLLM.TII.ae under the TII Falcon License, an Apache 2.0-based license, which promotes responsible and ethical AI development.





 

News tips: email news@newsji.com, phone 050 2222 0002, fax 050 2222 0111. Address: 60, Gamasan-ro 27-gil, Guro-gu, Seoul, Unit 1-37

Internet news service business registration: Seoul Ja-00447, registered December 23, 2013. Responsibility for news curation and youth protection: the CEO

Copyright © All rights reserved.