
AI simulates 500 million years of evolution

Hi everyone! Here’s what you need to know about AI today:

👉 It would take nature 500 million years to build this new AI-designed protein

👉 Alibaba released another top-performing enterprise LLM

👉 OpenAI launched ChatGPT Gov, a chatbot designed for the US government

and many more!

📧 Did someone forward you this email? Subscribe here for free to get the latest AI news every day!

Read time: 5 minutes

SCIENCE

Simulating 500 million years of evolution with a language model

Source: EvolutionaryScale | An artist's depiction of esmGFP protein

What’s going on: Scientists at a company called EvolutionaryScale AI have used AI to create a new glowing protein named esmGFP, which they estimate would have taken nature about 500 million years to evolve. This protein shares only 58% sequence similarity with the nearest known natural protein in the same category, indicating a significant departure from existing biological structures. The generative AI model used in their study, known as ESM3, was trained on a vast dataset of protein sequences to predict and design new proteins.

What does it mean: The discovery of esmGFP not only shows the capabilities of AI in protein design but also opens up potential new applications in biotechnology, including medical imaging, biosensors, and gene expression studies. ESM3 could serve as a tool for scientists to design proteins that capture carbon emissions, enzymes that break down plastic, and new medicines for diseases such as cancer and Alzheimer’s.

More details:

  • Researchers estimate that achieving this sequence through natural mutations would require 96 specific changes, a scenario unlikely to occur in nature due to the vastness of biological timescales.

  • The rapid design of such proteins by AI could significantly accelerate research and development in various scientific fields by bypassing the slow pace of natural evolution.

ALIBABA

Another ground-breaking Chinese AI model

Source: Qwen blog | The red bars represent the performance of Qwen2.5-Max

What’s going on: Alibaba Cloud has introduced a new LLM, Qwen2.5-Max, this time challenging US tech giants in the enterprise AI space. The model outperforms several leading AI models, including xAI’s Grok-1.5, Paris-based Mistral Large, Gemini 1.5 Pro, Llama-3.1-405B, and the controversial DeepSeek-V3. On some benchmarks it is on par with the latest models from Anthropic and Google, and only slightly behind Claude-3.5-Sonnet and OpenAI’s GPT-4o. (Read the full comparison on Qwen’s blog)

What does it mean: Competition between China and the US is getting more intense every day. At the current speed of innovation, it is hard to predict exactly who will win the so-called “AI war.” Chinese AI startups, however, keep releasing game-changing new models at a remarkable pace. This release by Alibaba signals a broad strategy by Chinese tech companies to assert themselves in the global AI market, particularly in response to restrictions on access to US technology.

More details:

  • Qwen2.5-Max achieved an impressive 89.4% on the Arena-Hard benchmark, 9.5 on MT-Bench, and 87.7% on the MMLU test.

  • Qwen2.5-Max uses mixture-of-experts architecture, which significantly enhances efficiency by potentially cutting infrastructure costs by 40-60%.

  • The Qwen2.5-Max API (model name qwen-max-2025-01-25) is now available: register an Alibaba Cloud account, activate the Alibaba Cloud Model Studio service, then navigate to the console and create an API key.
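Once you have an API key, calling the model looks like any OpenAI-style chat request. A minimal sketch, assuming Model Studio exposes an OpenAI-compatible chat-completions endpoint (the URL below is an assumption; the model name comes from the announcement):

```python
import json

# Assumed endpoint for Alibaba Cloud Model Studio's OpenAI-compatible API.
API_URL = "https://dashscope-intl.aliyuncs.com/compatible-mode/v1/chat/completions"

def build_request(api_key: str, prompt: str) -> tuple[dict, bytes]:
    """Build the headers and JSON body for a qwen-max-2025-01-25 chat call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": "qwen-max-2025-01-25",  # model name from the release
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return headers, body

# To actually send it (requires a valid key), e.g. with the standard library:
#   import urllib.request
#   headers, body = build_request("YOUR_API_KEY", "Hello, Qwen!")
#   req = urllib.request.Request(API_URL, data=body, headers=headers)
#   reply = json.load(urllib.request.urlopen(req))
```

Because the endpoint follows the OpenAI wire format, existing OpenAI client libraries should also work by pointing their base URL at Model Studio.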

🏛️ OpenAI has launched ChatGPT Gov, a version of its AI chatbot specifically tailored for US government agencies, allowing them to use the platform with non-public, sensitive information within secure environments.

🤗 Hugging Face researchers are attempting to recreate the DeepSeek-R1 reasoning model in a fully open-source format, aiming to make all components, including training data and code, publicly accessible.

🦆 Jack Dorsey, co-founder of Twitter, has launched Goose, a new, ultra-simple, open-source AI agent building platform from his startup Block.

🚀 Elon Musk has accused DeepSeek of lying about the number of GPUs used to train its latest model, R1. DeepSeek claims it used just over 2,000 Nvidia GPUs, but Musk and many other experts believe the company actually had access to more than 50,000 Nvidia H100 GPUs.

🔓 Italy, Australia, and the US Navy are raising ethical and privacy concerns about the widespread use of DeepSeek AI models. Italy’s data watchdog has sent the first formal request to DeepSeek, claiming the data of millions of Italians is at risk.

🤖 SoftBank, the Japanese multinational holding company, is negotiating a $500 million investment in Skild AI, a robotics startup building a foundation model for robotics, valuing the company at $4 billion.

🤯 David Sacks, President Trump’s AI and crypto “czar”, claims there's substantial evidence that DeepSeek used OpenAI's models to train its own AI.

📈 Nvidia's stock began to recover on Tuesday after a massive drop triggered by DeepSeek's AI model announcement, which resulted in a nearly $600 billion loss in market cap for Nvidia.

AI + Parkinson’s law

Parkinson’s Law states that ‘work expands to fill the time available for its completion.’ Identify ways I can compress deadlines for my top [number] goals this week to force greater focus and efficiency. Include actionable tips to avoid burnout while maintaining high output.
My top [number] goals:
[your goals]

DeepSeek-R1’s answer | Thought process took 48 seconds

Mayo Clinic - Data Science Analyst

Thank you for staying with us as always! If you are not subscribed, subscribe here for free to get more of these emails in your inbox! Cheers!