Velvet
Artificial Intelligence. Made Human
Velvet is a family of LLMs natively developed by Almawave, built on a proprietary architecture that does not rely on pre-existing models.
Each model is designed to optimize performance while minimizing energy consumption and costs, making AI accessible and sustainable, in full compliance with European regulatory frameworks.
The family includes both enterprise-hosted models and open-weights models: the enterprise versions offer advanced capabilities for regulated environments, while the open-weights models reflect Almawave’s strong commitment to open-source development.
AI that makes the difference
Velvet is designed with a natively multilingual, non-Anglocentric approach, aiming to preserve the linguistic and cultural specificities of each European country.
It stands out for its advanced reasoning capabilities and sophisticated functions, including the optimized orchestration and selection of the AI agents best suited to each task.
Native integration with AIWave, the multi-model and multi-agent platform, allows for seamless adaptation of the models to different vertical sectors, extending their applicability to a broad range of business use cases.
In highly regulated domains such as Healthcare, Welfare, Justice, Security, Mobility, and Finance, Velvet’s sector adaptability ensures careful context management, protection of sensitive information, and data reliability.
At the same time, its effectiveness across use cases is coupled with an ongoing commitment to energy efficiency and sustainability.
Our offering
VELVET 2B
Velvet 2B is the updated version of the smallest model in the family. It is designed to adapt effectively to specific tasks and a wide range of usage contexts. With its compact size and high level of optimization, Velvet 2B delivers reliable performance even in resource-constrained environments or edge deployment scenarios.
VELVET SPEECH 2B
A compact and versatile language model designed for dynamic real-time interactions, Velvet Speech 2B processes and understands spoken language. Featuring an integrated speech recognition system, this model enables seamless audio management in applications such as voice commands, conversational interfaces, and assistive technologies.
VELVET 14B
A 14-billion-parameter Large Language Model, operating in Italian and five other European languages (German, Spanish, French, Portuguese, and English), with a 127-thousand-word vocabulary and a 128-thousand-token context window, making it suitable for the most complex document sets. Trained on over 4 trillion tokens, the model was developed to adapt efficiently to a range of vertical industries.
VELVET 25B
The 25-billion parameter Velvet 25B model excels in processing very large and complex texts—such as legal documents, scientific dossiers, or legislative acts—while maintaining consistency and precision across distant passages.
Trained in a natively multilingual fashion, it is optimized for text processing in all 24 official EU languages.
Velvet 25B is designed to maximize performance while minimizing energy consumption and costs, and it can be deployed even on a single GPU, making AI accessible not only to large organizations but also to entities with limited infrastructure.
Why choose Velvet?
AWARE
Designed for the Italian and European cultural and regulatory context, it is trained on selected data to minimize bias and poses no systemic risk under the AI Act. It also prioritizes the confidentiality of personal data through a proprietary algorithm that allows data removal without the need for retraining.
LIGHTWEIGHT
Developed for sustainable use according to a “Green AI” approach, it provides effective yet low-power models that reduce training and operating costs. It is ready for deployment in the cloud, on major platforms, and on-premise.
AGILE
Designed to easily adapt to specific domains and industry languages, it is able to connect to internal business processes and data, while also integrating into specialized and “ready-to-use” applications.