Sr. Data Engineer - (Big Data, Spark, Scala, Python, AWS, RDBMS, SQL)

Gurgaon, HR, India

Nielsen

A global leader in audience insights, data and analytics, Nielsen shapes the future of media with accurate measurement of what people listen to and watch.



At Nielsen, we are passionate about our work to power a better media future for all people by providing powerful insights that drive client decisions and deliver extraordinary results. Our talented, global workforce is dedicated to capturing audience engagement with content - wherever and whenever it’s consumed. Together, we are proudly rooted in our deep legacy as we stand at the forefront of the media revolution. When you join Nielsen, you will join a dynamic team committed to excellence, perseverance, and the ambition to make an impact together. We champion you, because when you succeed, we do too. We enable your best to power our future.

Responsibilities

  • Build scalable, reliable, cost-effective solutions for both cloud and on-premises environments, with an emphasis on quality and best-practice coding standards.
  • Build and test cloud-based applications for new and existing backend systems to help development teams migrate to the cloud.
  • Build reusable platform code and components that can be used by multiple project teams.
  • Understand the enterprise architecture within the context of existing platforms, services and strategic direction.
  • Implement end-to-end solutions with sound technical architecture in a Big Data analytics framework, along with customized, scalable solutions, with a primary focus on performance, quality, maintainability, cost, and testability.
  • Drive innovative solutions within the platform to establish common components, while allowing customization of solutions for different products.
  • Develop design specifications and a continuous build and deployment strategy to support Agile methodology.
  • Set up and manage expectations with consultants engaged in projects.
  • Provide cloud integration development support to various project teams. 
  • Build rapid technical prototypes for early customer validation of new technologies.
  • Collaborate effectively with Data Science to understand, translate, and integrate methodologies into engineering build pipelines.
  • Collaborate with product owners to translate complex business requirements into technical solutions, providing leadership in the design and architecture processes.
  • Provide expert mentorship to project teams on technology strategy, cultivating advanced skill sets in application engineering and implementing modern software engineering practices.
  • Mentor junior team members, providing guidance and support in their professional development
  • Stay informed about the latest technologies and methodologies by participating in industry forums, maintaining an active peer network, and engaging actively with customers.
  • Cultivate a team environment focused on continuous learning, where innovative technologies are developed and refined through collaborative effort

Key Skills

  • Domain Expertise
  • Bachelor’s degree in computer science or engineering, plus 4-8 years of experience in information technology solutions development.
  • Must have strong implementation expertise in cloud architecture.
  • Must have strong analytical and technical skills in troubleshooting and problem resolution.
  • Must have the ability to provide solutions utilizing best practices for resilience, scalability, cloud optimization, and security.
  • 3+ years of experience developing distributed big data processing applications using Apache Spark, and building applications with immutable infrastructure in the AWS Cloud using automation technologies such as Terraform, Ansible, or CloudFormation.

  • Technical Skills
  • Experience in software development using programming languages and tools/services: Java or Scala, Big Data, Hadoop, Spark, Spark SQL, Presto/Hive, cloud (preferably AWS), Docker, RDBMS (such as Postgres and/or Oracle), Linux, shell scripting, GitLab, Airflow, Cassandra, and Elasticsearch.
  • Experience with big data processing using Apache Spark with Scala.
  • Experience with orchestration tools such as Apache Airflow or similar.
  • Strong knowledge of Unix/Linux OS, commands, shell scripting, Python, JSON, and YAML.
  • Agile Scrum experience in application development is required.
  • Strong knowledge of AWS S3 and PostgreSQL or MySQL.
  • Strong knowledge of AWS compute services: EC2, EMR, and AWS Lambda.
  • Strong knowledge of GitLab/Bitbucket.
  • AWS certification is a plus.
  • "Big data" systems and analysis
  • Experience with data warehouses or data lakes

  • Mindset and attributes
  • Very strong verbal and written communication skills
  • Advanced analytical and technical skills in troubleshooting and problem resolution.
  • Ability to coach, mentor and provide guidance to junior colleagues
Please be aware that job-seekers may be at risk of targeting by scammers seeking personal data or money. Nielsen recruiters will only contact you through official job boards, LinkedIn, or email with a nielsen.com domain. Be cautious of any outreach claiming to be from Nielsen via other messaging platforms or personal email addresses. Always verify that email communications come from an @nielsen.com address. If you're unsure about the authenticity of a job offer or communication, please contact Nielsen directly through our official website or verified social media channels.


Perks/benefits: Career development

Region: Asia/Pacific
Country: India
