Senior Data Architect & Engineer (Azure Databricks Platform)
NA-US-IA-Virtual Office, United States
Full Time Senior-level / Expert USD 126K - 158K
Corteva Agriscience
Corteva Agriscience™ is a publicly traded, global pure-play agriculture company that provides farmers around the world with the most complete portfolio in the industry.
Who are we, and what do we do?
Corteva Agriscience is the only major agriscience company in the world completely dedicated to agriculture. Our purpose is to enrich the lives of those who produce and those who consume, ensuring progress for generations to come. Our inspiration is to be an innovator, driving the next generation of agriculture products that help farms and farmers flourish, and, through partnering with society, to become the most trusted partner in the global agriculture and food community.
We are seeking a highly skilled Data Architect & Engineer to lead the design, development, and implementation of scalable data models and pipelines in Azure Databricks. This hybrid role bridges architecture and engineering, and is instrumental in building a high-performance enterprise data lakehouse supporting commercial, production, and finance domains. The platform will serve as the foundation for data-driven decisions, advanced analytics, and AI model development across the organization.
What You’ll Do:
Architecture & Data Modeling
- Design scalable and maintainable data models across commercial, production, and finance domains.
- Define and enforce enterprise-wide data architecture standards, naming conventions, and data modeling best practices.
- Collaborate with domain experts, analysts, and business leaders to translate data requirements into logical models.
Engineering & Implementation
- Build and optimize data pipelines in Databricks using PySpark, SQL, Delta Lake, and Delta Live Tables.
- Implement data transformation logic (ELT) to curate clean, trusted, and high-performance data layers.
- Develop data products using Unity Catalog, Alation, Databricks Asset Bundles, and GitLab CI/CD workflows.
- Ensure query optimization, data quality, and high availability of data pipelines.
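As a rough illustration of the curation work described above, here is a minimal sketch of a bronze-to-silver ELT step with a data-quality rule. Plain Python dicts stand in for Delta Lake rows, and the field names (`order_id`, `amount`, `region`) are hypothetical; a real implementation on this platform would use PySpark DataFrames and Delta Live Tables expectations instead.

```python
# Minimal bronze -> silver curation sketch. Assumes plain Python dicts stand
# in for Delta Lake rows; field names are illustrative, not from the posting.

def curate_silver(bronze_rows):
    """Filter and standardize raw (bronze) records into a trusted silver layer."""
    silver = []
    for row in bronze_rows:
        # Data-quality rule: drop records missing a primary key or measure.
        if row.get("order_id") is None or row.get("amount") is None:
            continue
        silver.append({
            "order_id": str(row["order_id"]).strip(),
            "amount": round(float(row["amount"]), 2),  # normalize precision
            "region": (row.get("region") or "UNKNOWN").upper(),
        })
    return silver

bronze = [
    {"order_id": " 1001 ", "amount": "19.991", "region": "us"},
    {"order_id": None, "amount": "5.00"},            # rejected: no key
    {"order_id": "1002", "amount": 42, "region": None},
]
print(curate_silver(bronze))
```

In a Delta Live Tables pipeline the drop rule would typically be declared as an expectation (e.g., `expect_or_drop`) rather than hand-coded, so quality metrics are tracked automatically.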
Platform Management
- Manage and orchestrate workflows using Databricks Workflows, Azure Data Factory, or equivalent tools.
- Integrate structured and unstructured data from diverse sources (e.g., ERP, CRM, IoT, APIs) into the lakehouse.
- Ensure platform security, governance, and compliance using Unity Catalog, RBAC, and lineage tools.
What Skills You Need:
- 5+ years of experience in data engineering, data architecture, or enterprise analytics roles.
- Strong knowledge of modern enterprise data architectures (including data warehouses, data lakehouses, data fabric, and data mesh), with an understanding of their trade-offs.
- Hands-on experience with Databricks on Azure, including Delta Lake tables and Unity Catalog.
- Proven expertise in data modeling (dimensional) and pipeline development (batch, stream) for cross-functional enterprise data.
- Proven experience with big data environments, and familiarity with modern data formats (e.g., Parquet, Avro) and open table formats (e.g., Delta Lake, Apache Iceberg).
- Proficient in SQL, PySpark, dbt, and Kafka for data engineering and transformation workflows.
- Deep understanding of Azure ecosystem (e.g., ADLS Gen2, Synapse, ADF, Key Vault).
- Experience with version control and CI/CD practices for data projects.
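The dimensional modeling skill named above can be sketched as a star-schema join: a fact table of additive measures keyed to a dimension table by a surrogate key. The tables and names here are hypothetical stand-ins, using plain Python dicts rather than warehouse tables.

```python
# Star-schema sketch: a sales fact table joined to a product dimension.
# Table contents and names are illustrative assumptions only.

dim_product = {  # dimension table keyed by surrogate key
    1: {"product_name": "Hybrid Seed A", "category": "Seed"},
    2: {"product_name": "Crop Protect B", "category": "Chemistry"},
}

fact_sales = [  # fact table: foreign key plus additive measures
    {"product_key": 1, "units": 10, "revenue": 500.0},
    {"product_key": 2, "units": 4, "revenue": 320.0},
    {"product_key": 1, "units": 6, "revenue": 300.0},
]

def revenue_by_category(facts, dim):
    """Join facts to the dimension and aggregate a measure by an attribute."""
    totals = {}
    for f in facts:
        category = dim[f["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + f["revenue"]
    return totals

print(revenue_by_category(fact_sales, dim_product))
```

The same shape in the lakehouse would be a Delta fact table joined to slowly changing dimension tables in SQL or PySpark; the surrogate-key join is what keeps the fact table narrow and the aggregation fast.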
Preferred Qualifications:
- Background in data integration for commercial, operations, and financial domains.
- Knowledge of data governance, observability, and cataloging tools (e.g., Alation).
- Experience optimizing distributed data processing and cost/performance tradeoffs.
- Familiarity with regulatory compliance in enterprise data environments.
Soft Skills:
- Excellent collaboration and communication skills across business and technical teams.
- Comfortable in agile, fast-paced, and highly accountable environments.
- Able to translate complex data problems into practical, scalable solutions.
Emerging Technologies & Practices:
- Apply modern architectural patterns such as Autonomous Data Products to create self-contained, discoverable, and reusable data components with defined ownership and SLAs.
- Drive adoption of data contracts, data observability, and lineage tracing to enhance reliability and governance across data domains.
- Evaluate and implement Lakehouse federation patterns and data mesh principles for scaling across global teams and business units.
- Champion the integration of semantic layers, feature stores, and time-travel auditing to support both business intelligence and machine learning use cases.
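The data contracts mentioned above can be illustrated with a minimal sketch: a producer-declared schema that consumers validate incoming records against before they enter a domain. The contract fields and types here are hypothetical, and real deployments would typically use a schema registry or a library such as Pydantic rather than hand-rolled checks.

```python
# Minimal data-contract sketch: validate records against a declared schema.
# Contract fields ("order_id", "amount", "region") are illustrative only.

CONTRACT = {  # hypothetical contract for an "orders" data product
    "order_id": str,
    "amount": float,
    "region": str,
}

def violations(record, contract):
    """Return a list of contract violations for one record."""
    problems = []
    for field, expected_type in contract.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    return problems

good = {"order_id": "A-1", "amount": 9.5, "region": "EMEA"}
bad = {"order_id": "A-2", "amount": "9.5"}
print(violations(good, CONTRACT))  # empty list: record honors the contract
print(violations(bad, CONTRACT))
```

Wiring such checks into pipeline entry points is what turns a contract from documentation into an enforced interface between data producers and consumers.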
Benefits – How We’ll Support You:
- Numerous development opportunities offered to build your skills
- Be part of a company with a higher purpose and contribute to making the world a better place
- Health benefits for you and your family on your first day of employment
- Four weeks of paid time off and two weeks of well-being pay per year, plus paid holidays
- Excellent parental leave, which includes a minimum of 16 weeks for both mothers and fathers
- Future planning with our competitive retirement savings plan and tuition reimbursement program
Learn more about our total rewards package here - Corteva Benefits
Check out life at Corteva! www.linkedin.com/company/corteva/life
Are you a good match? Apply today! We seek applicants from all backgrounds to ensure we get the best, most creative talent on our team.
This reflects a reasonable estimate of the targeted base salary for this role. This role is also eligible for an annual bonus. Based on factors such as geographic location and candidate qualifications, actual base pay is determined when an employment offer is made.
Corteva Agriscience is an equal opportunity employer. We are committed to embracing our differences to enrich lives, advance innovation, and boost company performance. Qualified applicants will be considered without regard to race, color, religion, creed, sex, sexual orientation, gender identity, marital status, national origin, age, military or veteran status, pregnancy related conditions (including pregnancy, childbirth, or related medical conditions), disability or any other protected status in accordance with federal, state, or local laws.