Last Week in AI #263: Apple shares details of its multimodal model, Cerebras unveils largest AI chip yet, Microsoft "acquires" Inflection AI, Stability AI CEO steps down, and more!
Apple shares how it trained its multimodal foundation model, Cerebras' new chip can train models with up to 24 trillion parameters, and major departures and personnel shifts hit Inflection AI and Stability AI
Top News
MM1: Methods, Analysis & Insights from Multimodal LLM Pre-training
Apple released a paper detailing its work on training Multimodal Large Language Models (MLLMs), examining how architecture components and data choices affect performance. The authors found that a balanced mix of image-caption, interleaved image-text, and text-only data…