Software Engineer - Big Data (Java/Scala, Python, Spark, SQL, AWS)
Bangalore, India
Nielsen
A global leader in audience insights, data, and analytics, Nielsen shapes the future of media with accurate measurement of what people listen to and watch. As a Software Developer, you will be a contributor on a Scrum/DevOps team focused on analyzing, developing, testing, and supporting highly complex application software built on Big Data technologies. Your primary objective is to ensure project goals are achieved and aligned with business objectives. You will also work closely with your Scrum team and program team to test, develop, refine, and implement quality software in production via standard Agile methodologies.
Responsibilities
- Build and test Cloud-based applications for new and existing backend systems to help facilitate development teams to migrate to the cloud with an emphasis on quality, best-practice coding standards, and cost-effectiveness
- Provide cloud integration development support to various project teams.
- Leverage modern design patterns and architectural principles to build platform reusable code and components that can be used across projects and teams
- Write both unit and integration tests, and develop automation tools for daily tasks
- Support product owner in defining future stories and tech lead in defining technical requirements for new initiatives
- Build rapid technical prototypes for early customer validation of new technologies
- Collaborate effectively with Data Science to understand, translate, and integrate methodologies into engineering build pipelines
- Collaborate with cross-functional teams and stakeholders to align development objectives with broader business goals
Key Skills
- Domain Expertise
- 3-5 years of hands-on software development experience with a bachelor's degree in computer science, engineering, or a related field
- Must have strong hands-on expertise in cloud architecture and implementation.
- Must have very good knowledge of storage, network, and compute services, and sound knowledge of multi-zone and multi-region designs.
- Must have the ability to provide solutions that follow best practices for resilience, scalability, cost optimization, and security.
- A quick learner who can pick up new technologies, programming languages, and frameworks in a short span of time
- Experience in software development using Java, Python, or Scala, with strong SQL skills.
- Experience in big data processing and distributed computing using Spark (Scala) or similar tools.
- Experience with orchestration tools: Apache Airflow or similar tools.
- Strong knowledge of Unix/Linux OS, commands, shell scripting, Python, JSON, and YAML.
- Agile scrum experience in application development is required.
- Strong knowledge of AWS S3 and PostgreSQL or MySQL.
- Deployment and automation experience with Terraform or CloudFormation would be a plus.
- Strong knowledge of compute services: EC2, EMR, AWS Lambda.
- Strong knowledge of GitLab/Bitbucket.
- AWS Certification is a plus.
- Strong verbal/written communication and interpersonal skills.
- Must have strong analytical and technical skills in troubleshooting and problem resolution.
Perks/benefits: Career development