Meta releases Ax 1.0 for automated machine learning optimization
Meta launches Ax 1.0, an open-source platform using Bayesian optimization to automate complex experimentation across AI development, infrastructure tuning, and hardware design.
Meta released version 1.0 of Ax, an open-source adaptive experimentation platform, on November 18, 2025. The platform employs machine learning to guide resource-intensive experimentation processes, addressing optimization challenges in AI model development, production infrastructure tuning, and hardware design.
The release coincides with publication of a paper titled "Ax: A Platform for Adaptive Experimentation" in AutoML 2025's ABCD Track. The paper details Ax's architecture, methodology, and performance comparisons against competing black-box optimization libraries including SyneTune, Optuna, Vizier, SMAC3, and HEBO.
According to the engineering blog post announcing the release, Ax addresses a fundamental challenge in modern AI development: understanding and optimizing systems with vast configuration possibilities. Single configuration evaluations often require substantial computational resources and time, making efficiency critical for researchers and developers.
The platform uses Bayesian optimization, an iterative approach that balances exploration of new configurations against exploitation of previously successful ones. Ax relies on BoTorch for implementing Bayesian optimization components. The system builds surrogate models from candidate configuration evaluations, identifies promising configurations for sequential testing, and repeats until finding optimal solutions or exhausting experimental budgets.
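The loop described above can be sketched in plain Python. The toy example below is not Ax's implementation (Ax uses Gaussian processes via BoTorch); it stands in a crude inverse-distance surrogate and a distance-based exploration bonus just to show the propose-evaluate-refit cycle and the exploration/exploitation tradeoff:

```python
import math

def expensive_evaluation(x):
    # Stand-in for a costly experiment; its shape is unknown to the optimizer.
    return (x - 0.3) ** 2 + 0.05 * math.sin(20 * x)

def surrogate_predict(x, observations):
    # Toy surrogate: inverse-distance-weighted mean of observed outcomes.
    weights, total = 0.0, 0.0
    for xi, yi in observations:
        w = 1.0 / (abs(x - xi) + 1e-6)
        weights += w
        total += w * yi
    return total / weights

def uncertainty(x, observations):
    # Crude uncertainty proxy: distance to the nearest observed configuration.
    return min(abs(x - xi) for xi, _ in observations)

def optimize(budget=15, grid_size=201):
    grid = [i / (grid_size - 1) for i in range(grid_size)]
    observations = [(0.0, expensive_evaluation(0.0)),
                    (1.0, expensive_evaluation(1.0))]
    for _ in range(budget):
        # Acquisition: predicted value minus an exploration bonus (minimizing).
        candidate = min(
            grid,
            key=lambda x: surrogate_predict(x, observations)
                          - 0.5 * uncertainty(x, observations),
        )
        observations.append((candidate, expensive_evaluation(candidate)))
    return min(observations, key=lambda obs: obs[1])

best_x, best_y = optimize()
```

Each iteration refits the surrogate to all observations before proposing the next candidate, which is what distinguishes adaptive experimentation from a fixed grid or random search.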
Under typical settings, Ax employs Gaussian processes as surrogate models during optimization loops. These flexible models make predictions while quantifying uncertainty, proving especially effective with limited data points. The platform uses acquisition functions from the expected improvement family to suggest next candidates by calculating expected value of new configurations compared to previous best results.
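For minimization with a Gaussian posterior at a candidate point, expected improvement has a closed form: EI = (y* − μ)Φ(z) + σφ(z) with z = (y* − μ)/σ, where y* is the best value observed so far. The sketch below implements that formula directly with the standard library; it illustrates the acquisition family Ax draws from, not Ax's own code:

```python
import math

def expected_improvement(mu, sigma, best_observed):
    """Closed-form expected improvement for minimization, given a Gaussian
    posterior with mean `mu` and standard deviation `sigma` at a candidate,
    and the lowest objective value `best_observed` seen so far."""
    if sigma <= 0.0:
        # No posterior uncertainty: improvement is deterministic.
        return max(best_observed - mu, 0.0)
    z = (best_observed - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal PDF
    return (best_observed - mu) * cdf + sigma * pdf

# A candidate predicted equal to the incumbent but with high uncertainty can
# score higher than one predicted slightly better with no uncertainty at all.
uncertain = expected_improvement(mu=1.0, sigma=0.5, best_observed=1.0)
certain = expected_improvement(mu=0.95, sigma=0.0, best_observed=1.0)
```

The comparison at the end shows why uncertainty quantification matters: the acquisition function rewards candidates the model is unsure about, which drives exploration.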
Meta applies Ax across multiple disciplines. Traditional machine learning tasks include hyperparameter optimization and neural architecture search. GenAI applications involve discovering optimal data mixtures for training large language models. Production infrastructure teams use Ax for tuning compiler flags, while physical engineering teams optimize design parameters for augmented and virtual reality devices.
The platform recently enabled design of faster-curing, low-carbon concrete mixes deployed at Meta data center construction sites. These mixtures support the company's net zero emissions goal for 2030.
Ax provides sophisticated analysis capabilities beyond configuration optimization. The platform generates plots and tables showing optimization progress over time, Pareto frontiers illustrating tradeoffs between metrics, parameter effect visualizations across input spaces, and sensitivity analyses quantifying each parameter's contribution to results. These tools enable experimenters to understand underlying systems while identifying optimal configurations for production deployment.
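A Pareto frontier of the kind mentioned above is straightforward to compute from trial results. This generic sketch (not Ax's API) filters a set of two-metric outcomes, assuming distinct points and that lower is better on both axes:

```python
def pareto_frontier(points):
    """Return the non-dominated subset of (metric_a, metric_b) points,
    minimizing both metrics. A quadratic scan is fine for the modest
    number of arms in a typical experiment."""
    frontier = []
    for p in points:
        dominated = any(
            q != p and q[0] <= p[0] and q[1] <= p[1] for q in points
        )
        if not dominated:
            frontier.append(p)
    return frontier

# Hypothetical arms: (model error, resource usage), lower is better on both.
arms = [(0.10, 9.0), (0.12, 5.0), (0.20, 2.0), (0.15, 6.0), (0.25, 8.0)]
frontier = pareto_frontier(arms)
```

Plotting the surviving points against the dominated ones is exactly the tradeoff view an experimenter uses to pick a deployment configuration.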
The paper published in AutoML 2025 demonstrates Ax's performance across synthetic and real-world black-box optimization tasks. Reviewers described the paper structure as exceptional, noting its clarity in explaining main features and advantages compared to other libraries. The API overview and code examples provide comprehensive summaries for users implementing the system.
Experimentation complexity extends beyond sophisticated machine learning methods. Production deployments require specialized infrastructure for managing experiment state, automating orchestration, and providing analysis and diagnostics. Goals typically involve balancing multiple objective metrics subject to constraints and guardrails rather than optimizing single metrics.
Ax enables users to configure dynamic experiments using state-of-the-art techniques while providing robust infrastructure for integrating cutting-edge methods into production systems. The platform accommodates problems with hundreds of tunable parameters and outcomes, where higher-dimensional settings make surrogate-based approaches particularly valuable compared to alternatives.
Meta's deployment scale includes thousands of developers using Ax for tasks like hyperparameter optimization, architecture search for AI models, tuning parameters for online recommender and ranking systems, infrastructure optimizations, and simulation optimization for hardware design. Multi-objective optimization capabilities enable simultaneous improvement of machine learning model accuracy while minimizing resource usage.
When researchers needed to shrink natural language models for first-generation Ray-Ban Stories, they used Ax to search for models optimally trading off size and performance. Engineers apply constrained optimization techniques for tuning recommender systems, optimizing key metrics while avoiding regressions in others.
The platform addresses problems where system quality depends on parameters with complex interactions requiring experimentation, and where experimentation carries meaningful costs. Ax employs data-driven approaches to adapt experiments as they unfold, solving these problems efficiently.
The release uses MIT licensing, inviting practitioner and research communities to contribute improved surrogate models, acquisition functions, research application extensions, bug fixes, and core capability improvements. The development team accepts contributions through GitHub Issues.
The advertising industry has increasingly adopted machine learning optimization across major platforms. Amazon enhanced Sponsored Display campaign optimization in April 2025, aligning Reach and Page Visit campaigns with conversion objectives through machine learning algorithm improvements. Google introduced AI-powered budget recommendations through its Ads API in October 2024, leveraging machine learning models to analyze campaign setup data.
Meta's advertising business demonstrates AI optimization impact. The company reported 22% advertising revenue growth to $46.6 billion in Q2 2025, with Generative Ads Recommendation System improvements increasing ad conversions by approximately 5% on Instagram and 3% on Facebook.
The AutoML 2025 paper received acceptance recommendations from all reviewers. One reviewer highlighted the well-thought-out structure and emphasis on practical library usage. Another noted the paper serves as a prototype for library papers, describing it as technically flawless with major impact and strong evaluation.
Reviewers requested additional information about feature limitations and capabilities that alternative frameworks support but Ax does not. Multi-fidelity optimization is described as in progress in the paper's future work section, but is not documented in the feature comparison table.
The reproducibility review confirmed the submitted code is well-documented with easy-to-follow installation instructions. The only minor hurdle involved installing cocoex requirements on machines without a C compiler such as GCC, which reviewers resolved easily.
Benchmark testing used BBOB (Black-Box Optimization Benchmarking) problems and the ZDT/DTLZ multi-objective test suites. Experiments ran with budgets of 50 or 100 evaluations, though reviewers noted this budget information should appear in the main paper rather than the appendix for proper result interpretation.
Google reduced incrementality testing budget requirements to $5,000 in May 2025 using Bayesian methodology similar to Ax's approach. The statistical framework incorporates prior knowledge about expected outcomes before analyzing experimental data, enabling meaningful conclusions with smaller sample sizes compared to traditional frequentist approaches.
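The mechanism behind smaller required samples can be illustrated with a textbook conjugate normal-normal update (a generic sketch, not Google's actual model): a prior on the expected lift combines with a small sample, and the posterior is tighter than either source alone:

```python
def normal_posterior(prior_mean, prior_var, sample_mean, sample_var, n):
    """Conjugate update for a normal mean with known observation variance.

    `sample_var` is the variance of one observation; the mean of n
    observations contributes precision n / sample_var.
    """
    precision = 1.0 / prior_var + n / sample_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + n * sample_mean / sample_var)
    return post_mean, post_var

# Hypothetical numbers: prior belief of ~2% lift, then a 25-observation sample.
post_mean, post_var = normal_posterior(
    prior_mean=0.02, prior_var=0.01 ** 2,
    sample_mean=0.03, sample_var=0.05 ** 2, n=25,
)
```

The posterior variance is smaller than both the prior variance and the sample-mean variance, which is how prior knowledge substitutes for sample size.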
Ax's open-source availability contrasts with proprietary optimization platforms. Google released an open-source Model Context Protocol server for its Ads API in October 2025, enabling AI tools to query advertising campaigns. The open-source approach accelerates adoption while allowing community contributions to influence development priorities.
Installation requires running `pip install ax-platform`; the Ax website hosts quickstart guides, tutorials, and methodology deep dives. The platform supports Python environments with comprehensive documentation for integration into existing workflows.
The Adaptive Experimentation team created Ax: Sebastian Ament, Eytan Bakshy, Max Balandat, Bernie Beckerman, Sait Cakmak, Cesar Cardoso, Ethan Che, Sam Daulton, David Eriksson, Mia Garrard, Matthew Grange, Carl Hvarfner, Paschal Igusti, Lena Kashtelyan, Cristian Lara, Ben Letham, Andy Lin, Jerry Lin, Jihao Andreas Lin, Samuel Müller, Miles Olson, Eric Onofrey, Shruti Patel, Elizabeth Santorella, Sunny Shen, Louis Tiao, and Kaiwen Wu.
Future development focuses on building new features for innovative experiment designs, optimization methods, and external platform integrations. The team continues improving Ax by extending capabilities for individual research applications that benefit the larger community.
Marketing professionals benefit from understanding these optimization techniques as platforms increasingly automate campaign management through similar methods. Meta's Advantage+ sales campaigns already boost return on ad spend by an average of 22%, powered by the same machine learning infrastructure underlying Ax.
The platform addresses optimization problems characterized by complex parameter interactions requiring experimentation with meaningful costs. As experimentation and testing capabilities expand across advertising platforms, understanding adaptive experimentation methodology becomes increasingly relevant for marketing professionals managing campaigns.
Timeline
- August 2022: Meta launches Advantage+ automated advertising campaigns
- 2023: Meta introduces Lattice unified model architecture, later reducing ads ranking models by approximately 100
- October 2024: Google introduces AI budget recommendations in Ads API v18
- April 2025: Amazon enhances Sponsored Display optimization algorithms
- May 2025: Google reduces incrementality testing budget to $5,000 using Bayesian methods
- June 3, 2025: AutoML 2025 ABCD Track accepts Ax platform paper
- July 2025: Meta reports 22% ad revenue growth driven by AI optimization
- October 2025: Google releases open-source MCP server for Ads API
- November 2025: Meta enhances value optimization delivering 29% higher ROAS
- November 18, 2025: Meta releases Ax 1.0 with accompanying AutoML 2025 paper
- 2025: Meta applies Ax to design low-carbon concrete for data center construction
Summary
Who: Meta's Adaptive Experimentation team released Ax 1.0, an open-source platform used by thousands of Meta developers and now available to the broader research and practitioner communities under MIT licensing.
What: Ax employs Bayesian optimization and machine learning to automate complex experimentation processes, using Gaussian processes as surrogate models and expected improvement acquisition functions to identify optimal configurations across systems with hundreds of tunable parameters.
When: Meta released Ax 1.0 on November 18, 2025, coinciding with publication of the platform's technical paper in AutoML 2025's ABCD Track, following years of internal deployment and development at Meta.
Where: The platform operates across Meta's infrastructure for AI model development, production system tuning, and hardware design, with applications ranging from hyperparameter optimization to concrete mixture design for data center construction supporting net zero emissions goals.
Why: Ax addresses efficiency challenges in resource-intensive experimentation, where a single configuration evaluation can require substantial time and computational resources. The platform enables researchers and developers to optimize complex systems with vast configuration spaces while understanding underlying parameter interactions through analysis tools that go beyond simple optimization outputs.