Senior Data Engineer
Maine - Remote Office
Full Time Senior-level / Expert USD 117K - 156K
WEX
WEX is the global commerce platform for fuel and fleet, employee benefits, and business payments. Simplify your business and let WEX handle the complex.
This is a remote position; however, the candidate must reside within 30 miles of one of the following locations: Boston, MA; Chicago, IL; the San Francisco Bay Area, CA; or Portland, ME.
About the Team/Role
We are the cloud data platform team at WEX. We empower businesses by providing cutting-edge data platforms and solutions that drive innovation and enhance decision-making. Our platform is built on a foundation of innovation, security, and scalability, and we are looking for a highly skilled and experienced Senior Data Engineer to join our dynamic team and help us continue to push the boundaries of what is possible.
As a Senior Data Engineer at WEX, you will drive the implementation and optimization of our cloud data platform. As a key member of our organization, you will play a critical role in architecting and implementing industry-leading data solutions that meet our business's needs and underpin our data platform.
You will also be at the forefront of driving transformative change in our data engineering capabilities. You will leverage your product expertise, along with your extensive experience in AWS, Snowflake, Big Data technologies, and DataOps, to design and implement cutting-edge data solutions. Your role will involve making design decisions that optimize performance, scalability, and data governance while meeting the unique requirements of WEX.
How you’ll make an impact
Lead the design and implementation of scalable, robust, and high-performance data pipelines, ETL processes, and analytics solutions using AWS services, Snowflake, and other Big Data technologies.
Apply a strong understanding of Data Warehouse, Data Lake, and Data Lakehouse architectures.
Collaborate closely with business stakeholders, data scientists, and other cross-functional teams to understand their requirements and translate them into innovative data solutions.
Performance-tune complex SQL queries on Snowflake, applying a deep understanding of Snowflake cost optimization strategies.
Work with data analysts to ensure that all data feeds are optimized and available at the required times, including through Change Data Capture (CDC) and other “delta loading” approaches.
Implement solutions for data integration, data management, metadata management, data processing, data security, and data quality, including technical delivery review, resolution of design issues, and the migration of existing on-premises applications to the data platform.
Improve data discovery and usability through data tagging and cataloging, ensuring the accuracy, consistency, and reliability of data.
Experience you’ll bring
Proficiency in data engineering
Strong expertise in designing and building scalable data pipelines, ETL processes, and analytics solutions using AWS services (e.g., S3, Glue, Lambda), ETL tools (e.g., Informatica IDMC), and Snowflake.
Experience with increasing responsibility implementing data solutions, with a focus on Snowflake or a similar cloud data warehouse.
Good understanding of cloud-based data solution components and architecture, covering data ingestion, data processing, data cataloging, data optimization, security, DevOps, consumption, etc.
Experience implementing core Snowflake concepts such as UDFs, zero-copy clones, time travel, micro-partitions, stored procedures, data import/export, and external tables.
Experience with scripting languages and data governance tools (e.g., Alation, Collibra) is an added advantage.
Exceptional communication and collaboration skills, with the ability to work effectively with senior stakeholders and cross-functional teams.
Extensive experience working with data in the fintech industry, including knowledge of rebates, commissions, and pricing, and familiarity with how these processes integrate with other business operations such as sales and finance.
Knowledge of financial accounting principles relevant to rebates and commissions is an added advantage.
Experience in software engineering, with a strong background in software design patterns and development best practices.
Development experience with programming languages such as Java, Python, or Scala.
Strong understanding of code quality, version control, testing frameworks, and CI/CD pipelines.
Required Qualifications:
Bachelor's degree in Computer Science or equivalent with 10 years of experience in data engineering and data warehousing, or a Master's degree with 7 years of experience in data engineering and data warehousing.
7+ years of experience with PowerCenter/Informatica Cloud (IICS), including expert-level knowledge of and experience with CDI and CAI.
Experience in API-based integration and real-time integration.
Knowledge of other Informatica products is always a plus.
Tags: APIs Architecture AWS Big Data CI/CD Computer Science Data governance Data management DataOps Data pipelines Data quality Data Warehousing DevOps Engineering ETL Finance FinTech Informatica Java Lambda Pipelines Python Scala Security Snowflake SQL Testing
Perks/benefits: Career development Competitive pay Flex hours Flexible spending account Flex vacation Health care Insurance Salary bonus