Google builds human-like memory into AI
Hello guys! We hope you had a nice weekend! Here’s what you need to know about AI today:
👉 OpenAI’s o3-mini is coming
👉 Google adds human-like memory to AI
👉 Millennials are losing their jobs to AI
and many more!
📧 Did someone forward you this email? Subscribe here for free to get the latest AI news every day!
Read time: 4.9 minutes
OPENAI
o3-mini is coming in a couple of weeks
Source: X
What’s going on: OpenAI CEO Sam Altman has confirmed that the o3-mini AI model will launch within weeks. The model follows the momentum of OpenAI's previous releases, notably building on the advanced reasoning capabilities shown by the o3 and o3-mini models announced last December, and will be able to address more complex tasks than OpenAI’s previous models.
What does it mean: Unlike previous launches, o3-mini will be available through both the API and the ChatGPT user interface from day one. This comes after OpenAI introduced its o1 models in September 2024, which focused on tackling harder queries by spending more processing time on them.
More details:
OpenAI’s o3 series of models are designed to excel in reasoning, coding, and mathematical tasks.
Recently, o3 models achieved an impressive 75.7% score on the ARC-AGI benchmark, a challenging test of general intelligence that had remained unbeaten for 5 years.
o3-mini is a cost-efficient alternative to the full o3 model, which can cost $1,000+ in compute just to solve a single task!
Five things make the o3 models special: a program synthesis technique, a sophisticated chain-of-thought (CoT) system, an evaluator model that self-assesses answers, the ability to execute its own CoTs as tools for adaptive problem-solving, and deep learning-driven search during inference to evaluate and refine potential solutions to complex problems. (A rough sketch of the evaluator idea is below.)
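OpenAI hasn’t published how o3 actually implements self-assessment, but the general idea is easy to picture: generate several chains of thought, then let an evaluator score them and keep the best one. Here is a minimal, hypothetical Python sketch of that “best-of-N with an evaluator” pattern; generate_cot() and evaluate() are placeholder stand-ins, not OpenAI’s real components.

```python
# Toy sketch of evaluator-based self-assessment (best-of-N selection).
# generate_cot() and evaluate() are hypothetical placeholders for the
# generator and evaluator models; OpenAI's actual o3 internals are unpublished.

def generate_cot(question: str, seed: int) -> str:
    """Produce one chain-of-thought answer candidate (placeholder)."""
    return f"[candidate reasoning #{seed} for: {question}]"

def evaluate(question: str, candidate: str) -> float:
    """Score how sound a candidate's reasoning looks, 0.0-1.0 (placeholder)."""
    return 0.5  # a real evaluator model would grade the reasoning here

def answer_with_self_assessment(question: str, n_candidates: int = 8) -> str:
    # Sample several independent chains of thought...
    candidates = [generate_cot(question, seed=i) for i in range(n_candidates)]
    # ...then keep the candidate the evaluator scores highest.
    return max(candidates, key=lambda c: evaluate(question, c))

print(answer_with_self_assessment("What is 17 * 24?"))
```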
Google just made human-like memory in AI possible
Source: Pixabay
What’s going on: Google's new AI architecture, Titans, has shown far better capabilities than the classic Transformer model (the technology that made ChatGPT possible) by mimicking aspects of human memory. It features a memory system that replicates human short-term and long-term memory, with the added ability to “forget” less relevant information. (Read the Titans paper)
What does it mean: Transformers, first introduced by Google researchers in the paper “Attention Is All You Need,” rely heavily on attention mechanisms to manage data. Titans goes further by prioritizing information based on its surprise factor, much like how humans remember unexpected or significant events more vividly. It includes a neural long-term memory module that enhances its ability to retain and use information over time.
More details:
Early tests suggest Titans excels in tasks such as language modeling, forecasting time series, and even DNA modeling, thanks to its unique "surprise metric" for data prioritization.
This development could lead to AI systems that are more intuitive, adaptable, and capable of handling complex tasks with rich context.
The Titans architecture combines attention mechanisms with a neural long-term memory module and introduces a meta in-context memory that learns to memorize at test time (see the sketch below).
For a technical deep-dive on Titans architecture, read this article.
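The core idea is simple to state: keep a compressed memory state, measure how “surprising” each new input is relative to that state, write surprising inputs in more strongly, and slowly decay the rest. Below is a deliberately simplified, hypothetical Python sketch of that idea; the SurpriseMemory class, threshold, and decay factor are illustrative assumptions, not the paper’s actual formulation (Titans uses a learned neural memory updated with gradient-based surprise at test time).

```python
# Toy sketch of surprise-driven memory, loosely inspired by the Titans paper.
# The real architecture uses a learned neural memory; everything here
# (class name, threshold, decay factor) is an illustrative assumption.
import numpy as np

class SurpriseMemory:
    def __init__(self, dim: int, decay: float = 0.9):
        self.state = np.zeros(dim)  # long-term memory state
        self.decay = decay          # "forgetting" factor for stale content

    def surprise(self, x: np.ndarray) -> float:
        # How far the input is from what memory already holds:
        # a large gap means "unexpected", hence worth memorizing.
        return float(np.linalg.norm(x - self.state))

    def update(self, x: np.ndarray, threshold: float = 1.0) -> None:
        s = self.surprise(x)
        self.state *= self.decay    # gradually forget everything a little
        if s > threshold:           # but only write inputs that are surprising
            self.state += (1 - self.decay) * x

mem = SurpriseMemory(dim=4)
for token in [np.ones(4), np.ones(4), np.array([5.0, 0.0, 0.0, 0.0])]:
    mem.update(token)
    print(mem.state)
```

In this toy version, repeated (unsurprising) inputs barely change the memory, while the out-of-distribution third token triggers a strong write, which is the behavior the “surprise metric” is meant to capture.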
💼 Perplexity has acquired “Read.cv”, a social media platform for professionals, which will begin to wind down operations with data export available until May 16.
📃 A new survey reveals that 38% of business leaders believe that Millennial careers, especially those in administrative (57.1%) and customer support roles (46.1%), are at risk due to AI advancements.
💰 China has established an $8.2 billion AI investment fund to counter tightened US trade controls and maintain its edge in the tech war.
📚 A new paper presented at the NeurIPS conference reveals that top large language models struggle to accurately answer high-level history questions, with the best-performing model achieving only about 46% accuracy.
🧬 OpenAI is collaborating with Retro Biosciences, a longevity startup, to extend human life by training an AI model called GPT-4b micro to re-engineer proteins that can turn human skin cells into stem cells.
🤖 Perplexity AI, the Jeff Bezos-backed conversational AI search engine, has made a bid to merge with TikTok's US operations, aiming to become a serious search rival to Google.
🚁 Amazon has suspended US drone deliveries after 2 drones crashed in rainy weather at a testing facility, pending a software update to its drone fleet.
💬 OpenAI has introduced a new feature for ChatGPT allowing users to customize its personality by assigning traits like "chatty" and "Gen Z" for more personalized interactions.
Ask AI for business advice
Can you share some common hurdles faced by start-ups and how to overcome them? Please provide examples or real-world use cases as well. Whenever possible, link to the relevant resources.
Grok 2’s answer
Zeromask - Lead Computer Vision Engineer
Captions - Software Engineer, Video Processing
Freed - Fullstack AI Engineer
WorldQuant - AI Scientist
Thank you for staying with us, as always! If you are not subscribed yet, subscribe here for free to get more of these emails in your inbox! Cheers!