Today's Guide to the Marketing Jungle from Social Media Examiner...
Friday is almost here! Before you unplug, here's one last round of insights and updates. Whether you read them now or save them for later, you won't want to miss these.
In today's edition:
- Reach your loyal followers on Instagram
- The Dynamic Creative Protocol for running ads with Meta's Andromeda algorithm
- 🗞️ Industry news from Instagram, YouTube, Gemini, and more
What Early Access Reels Mean for Marketers
If you're focused on relationship-based growth or balancing reach with real connection, this update might be your moment. Especially if Instagram plays a major role in your content mix.
Instagram's internal data shows people now spend only 7% of their time on Instagram with content from accounts they follow. That means your most loyal audience likely isn't even seeing your posts.
But a new feature called Early Access Reels could change that—at least for a while.
This is a chance to rebuild the kind of visibility that makes your brand memorable, where your face, your name, and your ideas stay not just in the algorithm's favor but in your audience's hearts and minds. Watch more here.
Facebook Ad Algorithm Changes for 2026: What Marketers Need to Know
Understanding Meta's Andromeda Algorithm
In late 2024, Meta introduced a completely new algorithm dubbed Andromeda, effectively replacing the old targeting systems advertisers relied on for years. Previously, advertisers controlled the experience by manually selecting audiences based on interests or behaviors. Andromeda flips this dynamic; now, Meta controls the targeting, using your ad creative to determine who should see your content.
Think of this shift like the evolution of Netflix recommendations. Originally, Netflix might have categorized viewers broadly—if you liked cooking shows, you saw a generic list of cooking shows. Today, the recommendation engine is far more sophisticated. It notices you watched a specific movie and creates a customized feed based on your unique viewing patterns.
Andromeda works similarly for ads. It uses deep learning and sequence learning to match your creative to a user's intent and position in the buyer's journey. For example, if the algorithm detects a user has just booked a ski resort reservation, it will intelligently serve ads for ski equipment, lift tickets, and winter clothing, anticipating their next needs rather than just reacting to a static interest category.
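To make that idea concrete, here is a toy, purely hypothetical Python sketch of the concept: look at a user's most recent action, guess the likely next need, and serve the creative that matches it. Every mapping below is invented for illustration and says nothing about how Meta's models actually work.

# Toy illustration only: NOT Meta's Andromeda system.
# Recent action -> likely next need (all mappings are invented examples).
NEXT_NEED = {
    "booked_ski_resort": "winter_gear",
    "bought_running_shoes": "race_registration",
    "searched_wedding_venues": "wedding_photography",
}

# Creative theme -> the ad concept that anticipates that need.
CREATIVE_BY_THEME = {
    "winter_gear": "Ad: ski equipment, lift tickets, winter clothing",
    "race_registration": "Ad: local 10K race sign-up",
    "wedding_photography": "Ad: wedding photography packages",
}

def pick_creative(recent_actions: list[str]) -> str:
    """Return the creative for the most recent action with a known next need."""
    for action in reversed(recent_actions):
        theme = NEXT_NEED.get(action)
        if theme:
            return CREATIVE_BY_THEME[theme]
    return "Ad: generic brand awareness"

print(pick_creative(["scrolled_feed", "booked_ski_resort"]))
# -> Ad: ski equipment, lift tickets, winter clothing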
The algorithm is also improving quickly. During beta testing in Q2, the Andromeda update delivered a 5% lift in ad conversions on Instagram; by Q3, that improvement had doubled to roughly 10%. This pace of change means sticking to old methods will leave you behind faster than ever before.
How to Set Up Meta Ads Creative to Work With Andromeda: The Dynamic Creative Protocol
To leverage the algorithm's ability to optimize, use a specific setup strategy. This method combines your top-performing assets into a single ad unit, allowing Meta to find the best combination for each user.
- Select the Right Setting: At the ad set level, toggle on "Dynamic Creative" if you are optimizing for leads. If you are optimizing for sales, select "Flexible Creative" at the ad level.
- Max Out Visuals: Upload as many high-performing images and videos as the tool allows. If you don't have enough videos, maximize your static images. Ideally, use proven winners rather than untested concepts.
- Draft Text Variations: Add two variations of primary text, one short and one medium length.
- Add Headlines: Input two of your top-performing headlines.
- Enable AI Optimization with Caution: Turn on the generative AI features Meta offers within the setup (cropping, animating, etc.). Critical Warning: You must read through the AI-generated text variations before launching. If you are in a regulated industry like health or finance, the AI may create copy that violates compliance rules.
By feeding the system a mix of your best assets, you give Andromeda the ingredients it needs to serve the perfect ad to the right person at the right time. If you build campaigns through the Marketing API rather than in Ads Manager, a rough sketch of the same multi-asset setup follows below.
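Here is a minimal sketch using the facebook_business Python SDK and its asset_feed_spec creative field, which is the API-level way to hand Meta several images, videos, text variations, and headlines to mix and match. All tokens, IDs, and hashes are placeholders, and the parent ad set still needs dynamic creative enabled; treat this as a starting point rather than a drop-in implementation.

# Sketch: supplying multiple creative assets via Meta's Marketing API.
# Placeholders like <AD_ACCOUNT_ID> must be replaced with real values.
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.adaccount import AdAccount

FacebookAdsApi.init(access_token="<ACCESS_TOKEN>")
account = AdAccount("act_<AD_ACCOUNT_ID>")

asset_feed_spec = {
    # Max out visuals: several proven images and videos.
    "images": [{"hash": "<IMAGE_HASH_1>"}, {"hash": "<IMAGE_HASH_2>"}],
    "videos": [{"video_id": "<VIDEO_ID_1>"}],
    # Two primary-text variations: one short, one medium length.
    "bodies": [
        {"text": "Short primary text."},
        {"text": "Medium-length primary text with a bit more context about the offer."},
    ],
    # Two top-performing headlines.
    "titles": [{"text": "Headline A"}, {"text": "Headline B"}],
    "ad_formats": ["SINGLE_IMAGE", "SINGLE_VIDEO"],
    "link_urls": [{"website_url": "https://example.com"}],
    "call_to_action_types": ["LEARN_MORE"],
}

# Note: the parent ad set must have dynamic creative enabled for Meta to
# mix and match these assets per user.
creative = account.create_ad_creative(params={
    "name": "Dynamic creative - multi-asset test",
    "object_story_spec": {"page_id": "<PAGE_ID>"},
    "asset_feed_spec": asset_feed_spec,
})
print(creative["id"])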
Other topics discussed include:
- When to Use Broad Targeting vs. Manual Targeting
- How to Simplify Campaign Structure and Budget
- How to Adjust Your Creative Library
- How to Select Ad Placements
- How to Analyze the Latest Meta Ads Metrics
- Looking Forward to Meta's GEM Update
Today's advice is provided with insights from Tara Zirker, a featured guest on the Social Media Marketing Podcast and speaker at Social Media Marketing World.
OpenAI Launches Faster, Smarter Image Generation: OpenAI has introduced ChatGPT Images powered by GPT‑Image 1.5, delivering faster and more precise image creation with up to 4× improved generation speed and stronger instruction following. Users can now perform nuanced visual edits while maintaining key image details, making the tool ideal for both expressive design and practical tasks. A new ChatGPT sidebar interface offers presets and trending prompts to inspire creativity. Available to all ChatGPT users and through the API, GPT‑Image 1.5 is also 20% cheaper than its predecessor, supporting faster iteration across creative, marketing, and e-commerce workflows. OpenAI
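For teams that want to fold image generation into a creative or e-commerce pipeline, the Images API can be called in a few lines of Python. The sketch below uses the official openai SDK; the "gpt-image-1.5" model identifier is assumed from the announcement's naming, so confirm it against OpenAI's published model list before relying on it.

# Minimal sketch of calling the OpenAI Images API from Python.
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

result = client.images.generate(
    model="gpt-image-1.5",  # assumed identifier for GPT-Image 1.5
    prompt="Flat-lay product photo of a ceramic coffee mug on linen",
    size="1024x1024",
)

# GPT-Image models return base64-encoded image data.
image_bytes = base64.b64decode(result.data[0].b64_json)
with open("mug.png", "wb") as f:
    f.write(image_bytes)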
Gemini Deep Research Now Creates Interactive Visual Reports: Google has upgraded Gemini Deep Research to generate visual reports that blend deep analysis with custom images, charts, and interactive simulations. Available to AI Ultra subscribers, this new capability helps users turn complex information into clear, engaging insights—whether planning budgets or modeling scientific theories. By integrating visuals and simulations directly into reports, Gemini transforms dense data into tangible, decision-ready outputs. Users can begin by selecting "Deep Research" in the Gemini app. Google
Meta Unveils SAM Audio for Intuitive, Professional-Grade Sound Editing: Meta has launched SAM Audio, an AI model that enables users to isolate sounds from complex audio environments using text, visual, and time-based prompts. As part of the Segment Anything collection, SAM Audio introduces new ways to interact with sound, allowing creators to filter out background noise, extract vocals or instruments, and fine-tune audio with greater precision. This unified model supports applications across music, video, accessibility, and scientific research. Users can try the model in the Segment Anything Playground or download it today. Meta
Instagram Tests TV App for Watching Reels: Instagram is testing a new way to enjoy Reels on the big screen with the launch of Instagram for TV, now available on Amazon Fire TV devices in the US. The app offers channels curated by interests—such as music, sports, and travel—where Reels play automatically for shared, lean-back viewing. Users can add up to five accounts, and Instagram plans to introduce features like phone-as-remote control and shared feeds. Designed with safety in mind, content follows PG-13 guidelines and incorporates teen protections from the mobile experience. Instagram
YouTube Tests Image Posts in Shorts Feed: YouTube is experimenting with displaying image posts in the Shorts feed, giving creators a new way to share stories and engage audiences visually. Creators can include up to 10 images per post, with eligible content created through the "Create" button being featured in the test. YouTube is also exploring the addition of music to image posts, signaling further development of creative tools for the platform's short-form format. YouTube
What Did You Think of Today's Newsletter?
Michael Stelzner, Founder and CEO
P.S. Add michael@socialmediaexaminer.com to your contacts list. Use Gmail? Go here to add us as a contact.
We publish updates with links for our new posts and content from partners.