Procurement Data Engineer III

Monterrey Business Center

Johnson Controls

Applying data from both inside buildings and beyond, our customers can now manage operations systemically.

Johnson Controls is a global diversified technology and multi-industrial leader serving a wide range of customers in more than 150 countries. Our 130,000 employees create intelligent buildings, efficient energy solutions, integrated infrastructure and next-generation transportation systems that work seamlessly together to deliver on the promise of smart cities and communities. Our commitment to sustainability dates back to our roots in 1885, with the invention of the first electric room thermostat. We are committed to helping our customers win and creating greater value for all of our stakeholders through strategic focus on our buildings and energy growth platforms. For additional information, please visit www.johnsoncontrols.com or follow us @johnsoncontrols on Twitter.

What will you do?

Join us in the Procurement Execution Center (PEC) as a Data Engineer, part of a diverse team of data and procurement professionals. In this role, you will be responsible for deploying and supporting the end-to-end (E2E) management of our data, including ETL/ELT, DW/DL, data staging, and data governance, and for managing the different layers of data required to ensure successful BI & Reporting for the PEC. You will work with multiple types of data spanning several functional areas of expertise, including Fleet, MRO & Energy, Travel, and Professional Services, among others.

How will you do it?

  • Serve as the main technical resource for any data-related requirement

  • Demonstrate an ability to communicate technical knowledge through project management and contributions to product strategy

  • Deploy data ingestion processes through Azure Data Factory to load data into Azure Synapse data models as required.

  • Design and build complex ETL/ELT processes with Azure Data Factory (ADF) and/or Python that, once deployed, will run on daily and weekly schedules (a minimal illustrative sketch follows this list).

  • Assemble large, complex data sets that meet functional / non-functional business requirements.

  • Build the infrastructure required for optimal ETL/ELT of data from a wide variety of data sources using Azure SQL and ADF.

  • Develop data models that enable data visualization, reporting, and advanced data analytics, striving for optimal performance across all models.

  • Maintain conceptual, logical, and physical data models along with corresponding metadata.

  • Manage the DevOps pipeline deployment model, including automated testing procedures.

  • Deploy data stewardship and data governance across our data warehouse to cleanse and enhance our data using knowledge bases and business rules.

  • Ensure compliance with system architecture, methods, standards, and practices, and participate in their creation.

  • Clearly articulate and effectively influence both business and technical teams

  • Perform the necessary data ingestion, cleansing, transformation, and coding of business rules to support annual Procurement bidding activities.

  • Support the deployment of a global data standard for Logistics.

  • Create data tools that help analytics and data science team members build and optimize our product into an innovative industry leader.

  • Support Rate Repository management as required (including Rate Card uploads to our DW).

  • Other Procurement duties as assigned.
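
For a concrete, purely illustrative picture of the ETL/ELT work described above, the sketch below shows a minimal daily job: extract recent records from a source system, apply cleansing and business rules, and load the result into a warehouse staging table. All connection strings, table names, columns, and rules are hypothetical placeholders rather than Johnson Controls systems; in practice this logic would typically be orchestrated by an Azure Data Factory pipeline on a daily or weekly schedule.

```python
# Illustrative sketch only; connection strings, tables, and rules are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_CONN = "mssql+pyodbc://user:pass@source-server/ERP?driver=ODBC+Driver+17+for+SQL+Server"
TARGET_CONN = "mssql+pyodbc://user:pass@dw-server/DW?driver=ODBC+Driver+17+for+SQL+Server"


def extract(engine) -> pd.DataFrame:
    """Pull yesterday's purchase-order lines from a hypothetical source table."""
    query = """
        SELECT po_number, supplier_id, category, amount_usd, order_date
        FROM dbo.purchase_order_lines
        WHERE order_date >= CAST(DATEADD(day, -1, GETDATE()) AS date)
    """
    return pd.read_sql(query, engine)


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Apply example cleansing and business rules."""
    df = df.dropna(subset=["supplier_id"])                    # drop rows with no supplier
    df["category"] = df["category"].str.strip().str.upper()   # normalize category labels
    df["high_value_flag"] = df["amount_usd"] > 100_000        # flag high-value orders for review
    return df


def load(df: pd.DataFrame, engine) -> None:
    """Append into a staging table that downstream warehouse models consume."""
    df.to_sql("stg_purchase_order_lines", engine, schema="stg",
              if_exists="append", index=False)


if __name__ == "__main__":
    source = create_engine(SOURCE_CONN)
    target = create_engine(TARGET_CONN)
    load(transform(extract(source)), target)
```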

What are we looking for?

  • Bachelor’s degree in a related field (Engineering, Computer Science, Data Science, or similar)

  • 4+ years of experience in BI engineering, data modeling, data engineering, software engineering, or other relevant roles.

  • Advanced working SQL knowledge and experience working with relational databases.

  • Knowledge of DW/DL concepts, data marts, data modeling, ETL/ELT, data quality/stewardship, distributed systems, and metadata management.

  • Experience building and optimizing data pipelines, architectures, and data sets.

  • Azure Data Engineering certification preferred (DP-203)

  • ETL/ELT development experience (3+ years); SSIS or ADF preferred.

  • Ability to resolve ETL/ELT problems by proposing and implementing tactical and strategic solutions.

  • Strong project management and organizational skills.

  • Experience with object-oriented and functional scripting languages: Python, Scala, C#, etc.

  • Experience with NoSQL databases is a plus to support the transition from On-Prem to Cloud.

  • Excellent problem solving, critical thinking, and communication skills

  • Relevant experience with Azure DevOps (CI/CD, git/repo management)

  • Due to the global nature of the role, proficiency in English is a must.

Johnson Controls does not request pregnancy or HIV testing as a requirement for admission, permanence or promotion.
