Technical Program Manager, Gemini Pre-Training
London, UK
Google DeepMind
About us
Artificial Intelligence could be one of humanity’s most useful inventions. At Google DeepMind, we’re a team of scientists, engineers, machine learning experts and more, working together to advance the state of the art in artificial intelligence. We use our technologies for widespread public benefit and scientific discovery, and collaborate with others on critical challenges, ensuring safety and ethics are the highest priority.
Snapshot
The Gemini program is pushing the boundaries of AI by developing a diverse range of models, with a critical focus on small models optimized for efficiency and accessibility. The On-Device Models Pre-training team is responsible for this crucial area, encompassing both on-device models designed for resource-constrained environments and the pre-training of open-source Gemma models. This team focuses on the initial phase of training these small models, ensuring they achieve optimal performance and efficiency for their intended use cases. This work is vital for enabling AI capabilities on a wide range of devices and fostering collaboration and innovation within the open-source community. As a Technical Program Manager, you will play a key role in ensuring the smooth and efficient execution of all small model pre-training projects, working closely with researchers, engineers, and other teams across Google DeepMind.
The role
As a Technical Program Manager on the On-Device Models Pre-training team, you will be a key contributor to the successful execution of pre-training runs for small models, including both on-device deployments and the open-source Gemma family of models. You will work closely with research scientists, engineers, and other TPMs to plan, track, and manage all aspects of these projects. This includes understanding the specific constraints and requirements of on-device environments, as well as the considerations for open-source releases (e.g., licensing, community engagement, documentation). You will manage dependencies and ensure that projects stay on schedule and within resource budgets. This role requires a strong technical understanding of machine learning, particularly as it relates to model compression, quantization, and optimization techniques, as well as an understanding of the principles of open-source software development. Excellent organizational skills and the ability to work effectively in a fast-paced, collaborative environment are essential.
Key responsibilities
- Program Planning & Execution: Contribute to the planning and execution of pre-training runs for all small models within the team's scope, encompassing both on-device deployments and the Gemma family of models. This includes developing detailed project plans, defining milestones, tracking progress, and managing dependencies.
- Risk Management: Identify potential risks and issues that could impact project timelines or success, particularly those related to small model performance, resource limitations, or open-source release requirements. Develop and implement mitigation strategies.
- Process Improvement: Identify opportunities to streamline the small model pre-training process and improve efficiency. Develop and implement best practices applicable to both on-device and Gemma model development. Contribute to the development and implementation of repeatable processes and automated workflows.
- Cross-Functional Collaboration: Work closely with research scientists, engineers, and other teams (e.g., hardware, software, legal, communications) to ensure alignment on goals, priorities, and dependencies. Facilitate clear communication and collaboration across these teams, including potential engagement with the open-source community for Gemma releases.
- Resource Coordination: Assist in managing and coordinating access to shared resources, such as specialized hardware, datasets, and software tools relevant to small model development.
- Status Reporting: Provide regular updates to stakeholders on project progress, highlighting key milestones, risks, and issues specific to small model pre-training.
- Documentation: Maintain clear and comprehensive documentation for processes, plans, and project status, including documentation suitable for open-source release of Gemma models.
- Program Closure: Contribute to the assessment of program viability and provide recommendations for closure when necessary, in the best interest of Google DeepMind.
- Team Support: Proactively share knowledge, best practices, and guidance with other team members to enhance overall team performance.
About you
In order to set you up for success as a Technical Program Manager at Google DeepMind, we look for the following skills and experience:
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- 5+ years of experience as a Technical Program Manager, Project Manager, or in a similar role involving the coordination and management of technical projects.
- Demonstrated experience managing complex technical projects with multiple stakeholders and dependencies.
- Strong understanding of project management methodologies and best practices.
- Solid technical understanding of machine learning concepts. Strong understanding of techniques and considerations specific to training small models, including model compression, quantization, and optimization, is highly desirable. Familiarity with the challenges and opportunities of both on-device model development and open-source model releases is crucial.
- Excellent communication, interpersonal, and presentation skills. Able to communicate technical information effectively to both technical and non-technical audiences, clearly articulating risks and opportunities to influence decision-making.
- Strong organizational skills and attention to detail.
- Ability to thrive in a fast-paced, dynamic, and collaborative environment.
- Proven ability to proactively identify and solve problems.
- Experience with managing projects involving specialized hardware or resource-constrained environments is a plus.
In addition, the following would be an advantage:
- Experience with small model pre-training and optimization techniques, particularly for on-device or open-source contexts.
- Experience with mobile or embedded systems development.
- Experience with open-source software development and community management.
- Experience with computing platforms (e.g., Google Cloud Platform).
- Familiarity with automation tools and scripting languages (e.g., Python).
- Experience with different project management approaches and a demonstrated ability to adapt to evolving project requirements.
Application deadline: 12pm GMT, 2nd April 2025
Note: In the event your application is successful, any offer of employment will be conditional on the results of a background check, performed by a third party acting on our behalf. For more information on how we handle your data, please see our Applicant and Candidate Privacy Policy.
At Google DeepMind, we value diversity of experience, knowledge, backgrounds and perspectives and harness these qualities to create extraordinary impact. We are committed to equal employment opportunities regardless of sex, race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, pregnancy, or related condition (including breastfeeding) or any other basis as protected by applicable law. If you have a disability or additional need that requires accommodation, please do not hesitate to let us know.