Introducing Jamba: A new hybrid SSM-Transformer model
Just released! Jamba-Instruct is now available in public preview.
Research
Research
SenseBERT: Driving Some Sense into BERT
SenseBERT is pre-trained to predict not only masked words but also their WordNet supersenses.
Research
Auxiliary Tuning and its Application to Conditional Text Generation
Auxiliary Tuning is an efficient method for adapting a pre-trained language model to a novel task, such as conditional text generation.