Senior Data Engineer
3010 - Bangalore Office, India
GoTo
Businesses of all sizes trust GoTo to power business growth and support customers. Explore our business phone system, contact center, and IT support products.
Job Description
Where you’ll work: India (Remote)
Engineering at GoTo
We’re the trailblazers of remote work technology. We build powerful, flexible work software that empowers everyone to live their best life, at work and beyond. And blaze even more trails along the way. There’s ample room for growth – so you can blaze your own trail here too. When you join a GoTo product team, you’ll take on a key role in this work and see what you build used by millions of users worldwide.
Your Day to Day
As a Senior Data Engineer, your responsibilities will include:
Design and Develop Pipelines: Build robust, scalable, and efficient ETL/ELT data pipelines to process structured data from diverse sources.
Big Data Processing: Develop and optimize large-scale data workflows and ETL pipelines using Apache Spark.
Cloud-Native Data Solutions: Architect and implement data solutions using AWS services such as S3, EMR, Lambda, and EKS.
Data Governance: Manage and govern data using catalogs like Hive or Unity Catalog; ensure strong data lineage, access controls, and metadata management.
Workflow Orchestration: Schedule, monitor, and orchestrate workflows using Apache Airflow or similar tools.
Data Quality & Monitoring: Implement quality checks, logging, monitoring, and alerting to ensure pipeline reliability and visibility.
Cross-Functional Collaboration: Partner with analysts, data scientists, and business stakeholders to deliver high-quality data for applications and enable self-service BI.
Compliance & Security: Uphold best practices in data governance, security, and compliance across the data ecosystem.
Mentorship & Standards: Mentor junior engineers and help evolve engineering practices including CI/CD, testing, and documentation.
What We’re Looking For
As a Senior Data Engineer, your background will include:
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering or software development, with a proven record of maintaining production-grade pipelines.
Proficient in Python and SQL for data transformation and analytics.
Strong expertise in Apache Spark and data lake management, including ACID transactions, schema enforcement/evolution, and time travel.
In-depth knowledge of AWS services—especially S3, EMR, Lambda, and EKS—with a solid grasp of cloud architecture and security best practices.
Solid data modeling skills (dimensional, normalized) and an understanding of data warehousing and lakehouse paradigms.
Experience with BI tools like Tableau or Power BI.
Familiar with setting up data quality, monitoring, and observability frameworks.
Excellent communication and collaboration skills, with the ability to thrive in an agile and multicultural team environment.
Nice to Have
Experience working on the Databricks Platform
Knowledge of the Delta Lake or Apache Iceberg table formats
Passion for Machine Learning and AI; enthusiasm to explore and apply intelligent systems
At GoTo, authenticity and inclusive culture are key to our thriving workplace, where diverse perspectives drive innovation and growth. Our team of GoGetters is passionate about learning, exploring, and working together to achieve success while staying committed to delivering exceptional experiences for our customers. We take pride in supporting our employees with comprehensive benefits, wellness programs, and global opportunities for professional and personal development. By maintaining an inclusive environment, we empower our teams to do their best work, make a meaningful impact, and grow their careers.
Perks/benefits: Career development, flex hours