Replicate Intelligence #12

Posted August 23, 2024

Welcome to Replicate's weekly bulletin! Each week, we'll bring you updates on the latest open-source AI models, tools, and research. People are making cool stuff and we want to share it with you. Without further ado, here's our hacker-in-residence deepfates with an unfiltered take on the week in AI.

Editor's note

Earlier this week I turned into Hot Mark Zuckerberg. I just did it to test this deepfake library (which turned out to not even be open source after all but oh well) but now I feel weird about the future of Reality.

You know how Zuck has been rehabilitating his PR image? He grew out his hair and started dressing cool. And there was an image that went viral where they used a filter to give him a beard.

Well, I have curly hair and a beard, so I used it as a test image. I said "It's me, Hot Mark Zuckerberg" and posted the video on X as a gag. A lot of people engaged with it -- and to my surprise, many of them thought it was real.

So I'm getting these replies from people who think it's just a video of Mark Zuckerberg, and I start to wonder, do I look that much like him? And I look in the mirror and for a second I don't recognize myself. Shouldn't I look... more like Zuck?

As I wrote last week, I believe we're headed for a multiverse of virtual realities, synthetic worlds generated in parallel and populated with digital beings. We're going to have millions of 'verses, each of which will feel real to the beings that inhabit it.

A lot of people are going to lose track of what is real and what is not. Arguably we already have: echo chambers, filter bubbles, video games, social media, the list goes on. We're in the dreamtime now, a shifting, blurring mess of hallucinated potentialities.

But then, maybe we already were. A preprint paper this week (see Research Radar) suggests that dreams act as a form of synthetic data, the brain generating virtual worlds and scenarios to improve learning and decision making. Dreams as a form of self-play.

The worlds we dream will inform our real world decisions. We'll be able to simulate, say, climate forecasts in different carbon scenarios, and model them more accurately and more vividly with every extra bit of data and compute. We are the model makers, and we are the dreamers of dreams.

What is reality? As ever, it is what we make it.

-- deepfates


FLUX.1: Exploring new frontiers in image generation

FLUX.1 continues to push the boundaries of what's possible in image generation. This week, we're highlighting two exciting developments:

  1. ReFlux: An interface that allows you to use multiple fine-tuned FLUX.1 models and generate images on an infinite canvas. You can explore various styles and concepts simultaneously, opening up new creative possibilities.

  2. Multi-LoRA Explorer: This tool lets you apply multiple LoRAs (Low-Rank Adaptations) in a single image, allowing for complex style combinations. Use it with LoRAs from Replicate, CivitAI, or Hugging Face.

You can use LoRAs to train image models on your own face, then combine them with other styles. On X this week we saw a great [tutorial thread by community member Justine Moore](https://replicate.fyi/wRzg7ta). Check it out if you want to learn how.

ReFlux | Multi-LoRA Explorer
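Stacking LoRAs like the tools above do boils down to passing more than one set of adapter weights in a single prediction request. Here's a minimal sketch against Replicate's HTTP predictions endpoint. The model name, the LoRA URLs, and the `extra_lora` input are assumptions for illustration, not confirmed parameters; check the model's schema on Replicate before using them.

```python
import json
import os
import urllib.request

def build_prediction_payload(prompt, main_lora, extra_lora=None):
    """Assemble the JSON body for a Replicate prediction request,
    optionally stacking a second LoRA on top of the first."""
    inputs = {"prompt": prompt, "lora_weights": main_lora}
    if extra_lora:
        # Hypothetical input name: a second style applied alongside the first.
        inputs["extra_lora"] = extra_lora
    return {"input": inputs}

payload = build_prediction_payload(
    "portrait of me in watercolor style",
    "fofr/flux-my-face",  # hypothetical personal face LoRA
    extra_lora="huggingface.co/some/watercolor-lora",  # hypothetical style LoRA
)

# Only attempt the API call if a token is configured.
token = os.environ.get("REPLICATE_API_TOKEN")
if token:
    req = urllib.request.Request(
        # Placeholder model path -- substitute the FLUX LoRA model you're using.
        "https://api.replicate.com/v1/models/black-forest-labs/flux-dev-lora/predictions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```

The payload builder is the interesting part: each extra LoRA is just another entry in the input dict, which is why tools like the Multi-LoRA Explorer can mix styles from Replicate, CivitAI, or Hugging Face in one generation.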


Cool tools

Real-time deepfake with one photo

Deep-Live-Cam is an open-source project that makes it easy to deepfake a face from a photo onto the head of a person in a video. Supposedly it has real-time face swapping; I couldn't run it at a satisfying frame rate on a MacBook Pro.

If I succeed with real-time Hot Mark Zuckerberg, you will hear about it.

github


Research radar

Brains do self-play while we sleep

A recent preprint posted on bioRxiv sheds light on how our brains might be "fine-tuning" themselves during sleep. The research suggests that during REM (Rapid Eye Movement) sleep, an animal's brain simulates actions and their consequences, creating a form of synthetic data for self-improvement.

Key findings:

  • The brain's internal representation of direction shifts during REM sleep, much as it does when an awake animal moves through its environment.
  • The superior colliculus (a midbrain structure that steers eye and head movements) issues motor commands during REM sleep, despite the body's physical immobility.
  • These "virtual" motor commands shift the internal representation of direction as if the action had actually been performed.

There are some parallels here to how AI models are trained and fine-tuned. Perhaps our brains are using a form of synthetic data generation to enhance learning and decision-making capabilities.

paper


Community spotlight

Replicate mentioned on Lex Fridman podcast

Pieter Levels, founder of PhotoAI, Nomad List, and RemoteOK, recently appeared on the Lex Fridman podcast, where he discussed various topics including AI, viral startups, and the digital nomad lifestyle.

Levels gave a shoutout to Replicate, highlighting its usefulness for AI engineers and developers.

Some of what they talked about:

  • The growing importance of AI in modern software development
  • How tools like Replicate are making AI more accessible to developers

It's cool to hear how Replicate is empowering AI engineers and democratizing access to AI models.

podcast | youtube


Bye for now

I'll be gone next week. Expect your next newsletter in September. That is, if you believe I was ever real at all...

-- deepfates