Principal Data Engineer
India - Bengaluru
Aptos
Making a career change is a big decision. Why consider Aptos?
Become a part of a team that is passionate about creating and delivering cutting-edge solutions for retailers worldwide. At our company, we’re dedicated to supporting your career aspirations and helping you exceed your goals. You’ll benefit from industry-leading training, global development opportunities, and the chance to collaborate within a diverse culture across our offices in nine countries. Our inclusive culture reflects our purpose: to make a difference for every colleague, every client, every day.
As a leading provider of Unified Commerce solutions for retail, our technology empowers top retail brands by optimizing product management, promotions, merchandising, and store operations. With the global shift toward our cloud-native, microservices architecture, opportunities for career growth have never been more exciting. Today, more than 100,000 retail stores in fashion, grocery, footwear, general merchandise, discount, and sporting goods rely on our solutions to generate nearly $2 trillion in annual revenue.
We hope you’ll join us in driving innovation and delivering impactful solutions as we continue leading the Unified Commerce revolution.
About the Role:
As a Principal Data Engineer - Platform, you will lead data modeling for our retail products and be responsible for architecting scalable, reliable data solutions that empower our data scientists, analysts, and business stakeholders to derive valuable insights and make data-driven decisions. You will play a crucial role in shaping our data strategy and fostering a data-centric culture within the organization, and you will own data governance and the security of data in the cloud, working with both mature technologies and emerging data practices.
Duties / Responsibilities:
- Data Architecture and Design:
  - Lead the design and architecture of robust, scalable, and efficient data warehousing, data lake, and data integration solutions for Aptos products.
  - Define data models, data governance policies, and data security standards.
  - Evaluate and recommend new data technologies and tools to optimize our data infrastructure.
  - Develop a centralized data platform architecture.
  - Create data exchange standards to ensure reusability and a decoupled architecture.
- Data Pipeline Development and Optimization:
  - Lead the development and maintenance of complex ETL/ELT pipelines to ingest, transform, and load data from various sources (structured, semi-structured, unstructured).
  - Implement data quality checks and monitoring systems to ensure data accuracy and reliability.
  - Optimize data pipelines for performance, scalability, and cost-efficiency.
- Technical Leadership and Mentorship:
  - Provide technical leadership and guidance to a team of data engineers.
  - Mentor and coach junior engineers, fostering their technical growth and development.
  - Drive best practices in data engineering methodologies, coding standards, and testing.
  - Define and document data quality rules and standards.
- Collaboration and Communication:
  - Collaborate closely with data scientists, analysts, product managers, and other stakeholders to understand their data requirements and deliver effective solutions.
  - Communicate complex technical concepts clearly and concisely to both technical and non-technical audiences.
  - Participate in cross-functional projects and contribute to the overall technology strategy.
- Data Governance and Security:
  - Implement and enforce data governance policies, ensuring data quality, integrity, and compliance.
  - Work closely with security teams to implement and maintain data security measures.
- Problem Solving and Innovation:
  - Troubleshoot and resolve complex data-related issues.
  - Stay up to date with the latest trends and technologies in data engineering and advocate for their adoption where appropriate.
  - Identify opportunities for innovation and improvement in our data infrastructure and processes.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 12+ years of experience in data engineering, with a significant portion in a senior or lead role.
- 5 years of experience with ETL tools and 1 year of experience with Snowflake.
- Deep understanding of data warehousing concepts, data modeling techniques, and ETL/ELT processes.
- Extensive experience with SQL and relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Strong proficiency in at least one programming language relevant to data engineering (e.g., Python, Scala, Java).
- Working experience with Coalesce ETL (nice to have).
- Experience with NoSQL databases (e.g., MongoDB, Cassandra).
- Experience with stream processing technologies (e.g., Kafka Streams, Flink).
- Experience with data visualization tools (e.g., Strategy ONE, Tableau).
- Knowledge of DevOps practices and CI/CD pipelines for data engineering.
- Hands-on experience with cloud-based data platforms and services (e.g., AWS, Azure, GCP), including data warehousing (e.g., Redshift, Snowflake, BigQuery), data lakes (e.g., S3, ADLS, GCS), and data processing services (e.g., Spark, EMR, Dataflow).
- Experience with big data technologies and frameworks (e.g., Hadoop, Spark, Kafka).
- Familiarity with data orchestration tools (e.g., Airflow, Apache NiFi, AWS Step Functions).
- Experience with data quality and data governance frameworks.
- Excellent problem-solving, analytical, and communication skills.
- Retail domain knowledge is an added advantage.
- Relevant certifications are preferred (e.g., AWS Certified Data Engineer, Google Cloud Professional Data Engineer).
- Excellent written and verbal English skills.
- Proven ability to lead and mentor technical teams.
What We Offer:
- A challenging and rewarding work environment.
- Opportunities for professional growth and development.
- A collaborative and supportive team culture.
- Competitive salary and benefits package.
- The chance to make a significant impact on our data-driven strategy.
We offer a competitive total rewards package including a base salary determined based on the role, experience, skill set, and location. For those in eligible roles, discretionary incentive compensation may be awarded in recognition of individual achievements and contributions. We also offer a range of benefits and programs to meet employee needs, based on eligibility.
We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. By submitting an application for this job, you acknowledge that any personal data or personally identifiable information that you provide to us will be processed in accordance with our Candidate Privacy Notice.