Senior Data Engineer

Hyderabad, India

New Relic


Your Opportunity

New Relic is looking for an experienced Senior Data Engineer with expertise in Java, Python, big data, and data modeling, plus experience in any cloud environment (AWS, GCP, Azure, etc.), to support our FinOps organization. This person will lead the design, development, and optimization of our data infrastructure. The successful candidate will have a strong background in data engineering and big data technologies, along with the leadership capabilities to guide and mentor the team. This role is pivotal in ensuring that our data systems are robust, scalable, and aligned with business needs.


What You'll Do
  • Lead the building of scalable, fault-tolerant pipelines with built-in data quality checks that transform, load, and curate data from various internal and external systems.
  • Provide leadership to cross-functional initiatives and projects.
  • Influence architecture design and decisions.
  • Build cross-functional relationships with Data Scientists, Product Managers, and Software Engineers to understand data needs and deliver on those needs.
  • Improve engineering processes and cross-team collaboration.
  • Ruthlessly prioritize work to align with company priorities.
  • Provide thought leadership to grow and evolve the Data Engineering function and drive SDLC best practices in building internal-facing data products, staying up to date with industry trends, emerging technologies, and best practices in data engineering.
This role requires
  • 5+ years of experience building data lakes in AWS (e.g., Spark/Glue, Athena), including data modeling, data quality best practices, and self-service tooling.
  • Development experience in at least one object-oriented language (Java, Python, R, Scala, etc.).
  • Expert-level proficiency in SQL.
  • Strong experience with dbt and Airflow.
  • Experience in BI and Data Warehousing.
  • Experience with Apache Iceberg tables.
  • Expertise in architecting and building solutions on any of these databases: Cassandra, DynamoDB, Elasticsearch, Aurora, Redshift, etc.
  • Experience with automation and orchestration tools.
  • Experience building enterprise products from scratch at product-based companies.
  • Ability to mentor junior engineers on the team.
  • A high bar for the design and architecture of scalable systems.
  • Experience with CI/CD tools such as Jenkins, GitLab CI, etc.
  • Demonstrated success leading cross-functional initiatives.
  • Passion for data quality, code quality, SLAs, and continuous improvement.
  • Deep understanding of data system architecture.
  • Deep understanding of ETL/ELT patterns.

 

Bonus points if you have 
  • Experience with the FinOps industry and best practices.
  • FinOps Certified Practitioner certification preferred.
  • Infrastructure cost and efficiency.
  • Cloud cost optimisation.

 

Please note that visa sponsorship is not available for this position.

   

Fostering a diverse, welcoming and inclusive environment is important to us. We work hard to make everyone feel comfortable bringing their best, most authentic selves to work every day. We celebrate our talented Relics’ different backgrounds and abilities, and recognize the different paths they took to reach us – including nontraditional ones. Their experiences and perspectives inspire us to make our products and company the best they can be. We’re looking for people who feel connected to our mission and values, not just candidates who check off all the boxes. 

If you require a reasonable accommodation to complete any part of the application or recruiting process, please reach out to resume@newrelic.com.

We believe in empowering all Relics to achieve professional and business success through a flexible workforce model. This model allows us to work in a variety of workplaces that best support our success, including fully office-based, fully remote, or hybrid.

Our hiring process

In compliance with applicable law, all persons hired will be required to verify identity and eligibility to work and to complete employment eligibility verification. Note: our stewardship of the data of thousands of customers means that a criminal background check is required to join New Relic.

We will consider qualified applicants with arrest and conviction records based on individual circumstances and in accordance with applicable law including, but not limited to, the San Francisco Fair Chance Ordinance.

Headhunters and recruitment agencies may not submit resumes/CVs through this website or directly to managers. New Relic does not accept unsolicited headhunter and agency resumes, and will not pay fees to any third-party agency or company that does not have a signed agreement with New Relic.

Candidates are evaluated based on qualifications, regardless of race, religion, ethnicity, national origin, sex, sexual orientation, gender expression or identity, age, disability, neurodiversity, veteran or marital status, political viewpoint, or other legally protected characteristics. 

Review our Applicant Privacy Notice at https://newrelic.com/termsandconditions/applicant-privacy-policy
