Data Engineer II
Monterrey Business Center I, Mexico
Johnson Controls
Applying data from both inside buildings and beyond, our customers can now manage operations systemically.
What will you do?
Join us in the Procurement Execution Center (PEC) as a Data Engineer, part of a diverse team of data and procurement professionals. In this role, you will support the end-to-end (E2E) management of our data, including ETL/ELT, data warehouse/data lake (DW/DL) operations, data staging, data governance, and the different data layers required for successful BI and reporting for the PEC. You will work with multiple types of data spanning multiple functional areas of expertise, including Logistics, MRO & Energy, Travel, and Professional Services, among others.
How will you do it?
- Deploy data ingestion processes through Azure Data Factory to load data models as required into Azure Synapse.
- Design and build ETL/ELT processes with Azure Data Factory (ADF) and/or Python which, once deployed, will run on daily and weekly schedules.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using Azure SQL and ADF.
- Develop data models that enable DataViz, Reporting and Advanced Data Analytics, striving for optimal performance across all data models.
- Maintain conceptual, logical, and physical data models along with corresponding metadata.
- Manage the DevOps pipeline deployment model, including automated testing procedures.
- Deploy data stewardship and data governance across our data warehouse to cleanse and enhance our data, using knowledge bases and business rules.
- Perform the data ingestion, cleansing, transformation, and coding of business rules needed to support annual Procurement bidding activities.
- Support the deployment of a global data standard for Logistics.
- Create data tools that help analytics and data science team members build and optimize our product into an innovative industry leader.
- Support Rate Repository management as required (including Rate Card uploads to our DW).
- Other Procurement duties as assigned.
What are we looking for?
- Bachelor’s degree in a related field (Engineering, Computer Science, Data Science, or similar)
- 3+ years of relevant experience in BI engineering, data modeling, data engineering, software engineering, or other relevant roles.
- Strong SQL knowledge and experience working with relational databases.
- Knowledge in DW/DL concepts, data marts, data modeling, ETL/ELT, data quality/stewardship, distributed systems and metadata management.
- Experience building and optimizing data pipelines, architectures, and data sets.
- Azure Data Engineering certification preferred (DP-203)
- ETL/ELT development experience (3+ years). SSIS or ADF are preferred.
- Ability to resolve ETL/ELT problems by proposing and implementing tactical/strategic solutions.
- Strong project management and organizational skills.
- Experience with object-oriented/functional scripting languages: Python, Scala, C#, etc.
- Experience with NoSQL databases is a plus to support the transition from On-Prem to Cloud.
- Excellent problem solving, critical thinking, and communication skills
- Relevant experience with Azure DevOps (CI/CD, git/repo management) is a plus
- Due to the global nature of the role, proficiency in English language is a must
Perks/benefits: Team events