Meta to spend up to $33 billion on AI, as Zuckerberg pledges open approach to LLMs

AI recommendations are driving monetisation, says Zuck, as Meta turns a corner

Meta plans to spend up to $33 billion this year to support the “ongoing build-out of AI capacity”, Chief Financial Officer Susan Li said on a Q1 earnings call. The company’s shares jumped 10% after it beat earnings expectations.

The unexpected recovery at Meta followed strong ad revenue performance, which hit $28.1 billion for its “family of apps” segment (Facebook, Instagram, Messenger, and WhatsApp), up 7% on a constant currency basis.

Executives attributed this to increased use of AI recommendations.

Across Instagram, about 40% of the content that users see is now AI-recommended, rather than based on people that they follow. (The figure is 20% for Facebook, Meta executives revealed on the April 26 earnings call.)


That has not driven people away. Rather – particularly since Meta launched Reels, its short-form video format – it has helped drive a 24% increase in time spent on Instagram, executives suggested, with some caveats.

Li added: "Our in-feed recommendations certainly go well beyond Reels. They cover all types of content, including text, images, links, group content, et cetera. And on Facebook, we see that AI-driven recommendations are continuing to grow and contribute to increasing engagement on the app. We're not quantifying this, but it's a place where I think we are both pleased with our progress and see significant opportunity for us to do better."

(Meta's net income meanwhile fell 24% to $5.7 billion for the quarter, largely on charges in the wake of three rounds of restructuring, including significant layoffs, earlier this year. Costs and expenses were $24.4 billion. Its CapEx target is $30 billion to $33 billion for 2023, largely, as highlighted above, to support the AI capacity build-out.)

Meta AI use: Agents for billions of people...

CEO Mark Zuckerberg told analysts: “I think that there's an opportunity to introduce AI agents to billions of people in ways that will be useful and meaningful.” (Meta now has over three billion people using at least one of its apps on a daily basis – as of March 2023 – and approximately 3.8 billion using at least one monthly.)

“We're exploring chat experiences in WhatsApp and Messenger, visual creation tools for posts in Facebook and Instagram and ads, and over time, video and multimodal experiences as well,” he added on the earnings call.

“I expect that these tools will be valuable for everyone from regular people to creators to businesses.”

Paid use of WhatsApp Business for enterprise messaging has grown by 40% quarter-over-quarter.

And Zuckerberg said that augmenting this with AI would grow it further: “I expect that a lot of interest in AI agents for business messaging and customer support will come once we nail that experience.”

Meta AI investment: We'll be more open, says Zuck

Meta released LLaMA, a set of large language models ranging from 7 billion to 65 billion parameters, in February under a non-commercial, research-only licence, spawning a host of bespoke LLMs built on it. Zuckerberg suggested the company was now exploring a more open approach.

Zuckerberg told analysts that this approach would be to Meta’s benefit: “Right now most of the companies that are training large language models have business models that lead them to a closed approach to development.

“I think that there's an important opportunity in the industry to help create an open ecosystem.

“If we can help be a part of this, then much of the industry, I think, will standardize on using these open tools and help improve them further. So this will make it easier for other companies to integrate with our products and platforms as we enable more integrations. That will help our product stay at the leading edge as well.”

("LLaMA is a model that we only really made available to researchers... lot of the work that we're doing, I think we would aspire to and hope to make even more open than that. So we'll need to figure out a way to do that" he added during a Q&A.)

Meta noted at the time of LLaMA’s release that, “unlike [DeepMind’s] Chinchilla, [Google’s] PaLM, or [OpenAI’s] GPT-3, we only use publicly available data, making our work compatible with open-sourcing, while most existing models rely on data which is either not publicly available or undocumented…”

Pressed by analysts on whether the future involved ChatGPT-like Meta AI APIs for business integration, Mark Zuckerberg said: "I view this more as a kind of back-end infrastructure advantage with potential integrations on the product side, but one that should hopefully enable us to stay at the leading edge and integrate more broadly with the community and also make the way we run all this infrastructure more efficient over time."

Pointing to the company's work releasing PyTorch and Open Compute (respectively, a machine learning library and a hardware optimisation project), he said that both had helped, after their public release, to "incorporate both innovation and scale efficiency into our own infrastructure. Our incentives, I think, are basically aligned towards moving in this direction. Now that said, there's a lot to figure out, right? So I mean you asked if there are going to be other opportunities. I hope so. I can't speak to what all those things might be now... The better we do with the foundational work, the more opportunities, I think, that will come and present themselves."
