Senior SAP ETL Developer
Anywhere, United States (Remote)
Carlisle Companies Incorporated
Carlisle Weatherproofing Technologies (CWT) is a leading supplier of building envelope solutions that effectively drive energy efficiency and sustainability in commercial and residential applications. We are looking for a Senior SAP ETL Developer to join our IT team working remotely.
Job Summary:
The Senior SAP ETL Developer will be responsible for leading the integration of SAP data—primarily from SAP ECC and SAP S/4HANA—into our unified, cloud-based Enterprise Data Platform (EDP). This role requires deep expertise in SAP data structures, combined with strong experience in enterprise ETL development using cloud-native technologies.
As a Senior SAP ETL Developer, you will play a key role in designing and implementing scalable data pipelines that extract, transform, and harmonize data from SAP systems into canonical models for analytics, reporting, and machine learning use cases. You will partner closely with data engineers, architects, and SAP subject matter experts to ensure accuracy, performance, and alignment with business requirements.
This role will support a variety of high-impact projects focused on enabling cross-ERP visibility, operational efficiency, and data-driven decision-making across finance, manufacturing, and supply chain functions. Your contributions will help standardize critical datasets and accelerate the delivery of insights across the organization.
This is a remote position open to candidates based in Canada or the United States.
Duties and Responsibilities:
- SAP Data Integration (70%):
- Architect and implement robust ETL pipelines to extract data from SAP ECC, SAP S/4HANA, SAP HANA, and SAP Datasphere using best-practice integration methods (e.g., ODP, CDS views, RFCs, BAPIs); a minimal extraction sketch follows this list.
- Analyze and interpret SAP’s internal data models (e.g., tables like BKPF, BSEG, MARA, EKPO) and work closely with SAP functional and technical teams.
- Work with SAP Datasphere (formerly SAP Data Warehouse Cloud) to federate or replicate SAP data for consumption in the EDP (highly desired).
- Review or interpret ABAP logic when necessary to understand legacy transformation rules and business logic (nice to have).
- Lead the end-to-end data integration process from SAP ECC, ensuring deep alignment with the EDP’s design and downstream data usage needs.
- Leverage knowledge of HANA Data Warehouse and SAP BW to support historical reporting and semantic modeling.
- Integrate data from non-SAP ERP systems such as JD Edwards and other legacy systems into a unified data platform.
- Transform and model data for analytics within Microsoft Fabric, including OneLake, Dataflows Gen2, and Synapse Data Warehouse.
- Establish data quality, security, and governance standards within integration workflows.
- Document technical processes and contribute to the ongoing improvement of data integration frameworks.
- Stay current on SAP and Microsoft data ecosystem developments to inform future architecture and tools.
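To make the extraction methods above concrete, the following is a minimal sketch of reading a slice of the MARA material master over RFC with the open-source pyrfc library. The host, credentials, field list, and filter are hypothetical placeholders, and a production pipeline would more often favor ODP or CDS-view extraction over RFC_READ_TABLE:

    # Minimal RFC extraction sketch (pyrfc). All connection values are
    # hypothetical placeholders, not values from this posting.
    from pyrfc import Connection

    conn = Connection(
        ashost="ecc-app.example.com",  # placeholder application server
        sysnr="00",
        client="100",
        user="ETL_SERVICE",            # placeholder service account
        passwd="********",
    )

    result = conn.call(
        "RFC_READ_TABLE",
        QUERY_TABLE="MARA",                         # material master
        DELIMITER="|",
        FIELDS=[{"FIELDNAME": "MATNR"}, {"FIELDNAME": "MTART"}],
        OPTIONS=[{"TEXT": "ERSDA >= '20240101'"}],  # WHERE-clause fragment
        ROWCOUNT=1000,
    )

    rows = [line["WA"].split("|") for line in result["DATA"]]
    print(f"Fetched {len(rows)} material records")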
- Enterprise ETL and Cloud Data Engineering (30%):
- Design and build robust, scalable ETL/ELT pipelines to ingest data into the Microsoft cloud using tools such as Azure Data Factory or Alteryx.
- Automate data movement from SAP into Azure Data Lake Storage / OneLake, enabling clean handoffs for consumption by Power BI, data science models, and APIs.
- Build data transformations in SQL, Python, and PySpark, leveraging distributed compute (e.g., Synapse or Spark pools); a PySpark sketch follows this list.
- Work closely with cloud architects to ensure integration patterns are secure, cost-effective, and meet performance SLAs.
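As a rough illustration of the pipeline shape described above, here is a minimal PySpark sketch that reads a raw SAP extract from Azure Data Lake Storage, applies basic typing and deduplication, and writes a curated copy. The storage paths are hypothetical; the columns are standard BKPF (accounting document header) fields:

    # Minimal PySpark transform sketch; abfss:// paths are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sap-fi-transform").getOrCreate()

    # Raw landing zone for accounting document headers (BKPF).
    raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/sap/bkpf/")

    cleaned = (
        raw
        .withColumn("posting_date", F.to_date("BUDAT", "yyyyMMdd"))  # SAP dates land as strings
        .withColumn("company_code", F.col("BUKRS"))
        .dropDuplicates(["BUKRS", "BELNR", "GJAHR"])                 # document key
    )

    # Partitioned, analytics-ready copy in the curated zone.
    (cleaned.write
        .mode("overwrite")
        .partitionBy("company_code")
        .parquet("abfss://curated@examplelake.dfs.core.windows.net/finance/bkpf/"))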
- Data Quality and Governance:
- Establish and enforce data quality standards and governance practices to ensure data integrity and consistency across integrated systems.
- Monitor, troubleshoot, and address data quality issues, implementing solutions as needed; a minimal rule-based check is sketched below.
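A minimal sketch of rule-based quality gates in PySpark, assuming the curated BKPF dataset from the previous sketch; the key columns and thresholds are illustrative assumptions:

    # Illustrative quality gates; fail the load before bad data propagates.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()
    df = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/finance/bkpf/")

    total = df.count()
    null_dates = df.filter(F.col("posting_date").isNull()).count()
    dupes = total - df.dropDuplicates(["BUKRS", "BELNR", "GJAHR"]).count()

    assert null_dates == 0, f"{null_dates} rows missing posting_date"
    assert dupes / max(total, 1) < 0.001, f"{dupes} duplicate document keys"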
- Collaboration and Communication:
- Collaborate with data modelers to define canonical enterprise models and develop mappings from SAP source tables (a mapping sketch follows this list).
- Work closely with cross-functional teams, including data analysts, business analysts, and ERP administrators, to understand data requirements and deliver solutions.
- Provide technical expertise and guidance to team members and stakeholders regarding data transformation and integration.
- Drive pragmatic approaches to solving complex business problems by providing data models suited for business intelligence and analytics tools such as Power BI.
- Work with engineering teams to enable the appropriate capture and storage of data.
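To illustrate the canonical-mapping work mentioned above, here is a minimal sketch that renames SAP purchase order line-item columns (EKPO) onto a shared model so SAP and non-SAP ERPs land in the same shape. The canonical column names are illustrative assumptions, not an actual CWT model:

    # Hypothetical source-to-canonical mapping for PO line items (EKPO).
    from pyspark.sql import functions as F

    EKPO_TO_CANONICAL = {
        "EBELN": "purchase_order_id",
        "EBELP": "line_number",
        "MATNR": "material_id",
        "MENGE": "ordered_quantity",
        "NETPR": "net_price",
    }

    def to_canonical(df, mapping, source_system):
        """Rename mapped columns and tag each row with its source ERP."""
        selected = df.select([F.col(src).alias(dst) for src, dst in mapping.items()])
        return selected.withColumn("source_system", F.lit(source_system))

    # Usage: canonical_po = to_canonical(ekpo_df, EKPO_TO_CANONICAL, "SAP_ECC")

The same mapping-table pattern can then be reused for JD Edwards or other legacy sources, with only the dictionary changing per system.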
Knowledge/Skills/Abilities:
- Proven track record working with SAP tables, modules, and transactional data across finance, supply chain, procurement, and production planning.
- Strong proficiency in one or more enterprise-grade ETL tools (Azure Data Factory, Informatica, Alteryx, SSIS).
- Proficient in SQL for data transformation and orchestration.
- Experience building data pipelines at scale, including partitioning, parallelization, error handling, and monitoring.
- Expert knowledge of data modeling, data warehousing, and big data technologies.
- Understanding of data privacy regulations and best practices related to storing PII.
- Experience with dimensional model design (star schema), Kimball data warehouse concepts, and converting data stored in 3NF into denormalized forms geared for reporting (see the sketch after this list).
- Proven track record of delivering significant business impact in Finance, Supply Chain, Sales, or other verticals.
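As a sketch of the 3NF-to-star-schema work above, the following Spark SQL joins normalized purchasing tables (EKKO headers, EKPO lines) into a denormalized fact table whose keys point at conformed dimensions. Paths and view registrations are placeholders:

    # Illustrative star-schema build; paths and view names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("star-schema").getOrCreate()
    base = "abfss://curated@examplelake.dfs.core.windows.net/sap"
    spark.read.parquet(f"{base}/ekko/").createOrReplaceTempView("ekko")
    spark.read.parquet(f"{base}/ekpo/").createOrReplaceTempView("ekpo")

    fact_purchases = spark.sql("""
        SELECT
            p.EBELN AS purchase_order_id,
            p.EBELP AS line_number,
            h.LIFNR AS vendor_key,      -- FK to dim_vendor
            p.MATNR AS material_key,    -- FK to dim_material
            h.BEDAT AS order_date_key,  -- FK to dim_date
            p.MENGE AS ordered_quantity,
            p.NETWR AS net_order_value
        FROM ekpo p
        JOIN ekko h ON p.EBELN = h.EBELN
    """)

    fact_purchases.write.mode("overwrite").parquet(
        "abfss://curated@examplelake.dfs.core.windows.net/marts/fact_purchases/"
    )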
Education and Experience:
- Required:
- Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related field.
- Relevant certifications (e.g., Microsoft Certified: Azure Data Engineer, Azure Data Scientist, Snowflake: SnowPro Data Engineer) are a plus.
- 10-15 years of IT experience, including at least 8-10 years of SAP experience (SAP ECC and SAP S/4HANA).
- Hands-on experience with Azure cloud data services, including Synapse Analytics, Data Lake Storage, and SQL Database.
- Experience building cloud-native applications, for example with Microsoft Azure, AWS, or GCP.
- Preferred:
- Experience with tools such as Azure Data Factory, Informatica, Alteryx, PySpark, and Python.
- Experience working with SAP Datasphere, SAP Business Data Cloud, and SAP Data Services.
- Experience with Microsoft Fabric.
Working Conditions:
- Office setting or a dedicated workspace in a home office.
- Access to a high-speed internet connection to facilitate video calls and project management.
- Typical work hours will be 8 a.m. to 5 p.m., with occasional hours worked outside this window to meet project deadlines.
- Minimal travel (roughly 5%) will be required to attend company onsite meetings, conferences, or industry events.