Senior Associate L1 DE-Big Data AWS

Hyderabad, India


Company Description

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.

Job Description

As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and you will independently drive design discussions to ensure the health of the overall solution.

Your Impact:

  • Data Ingestion, Integration and Transformation
  • Data Storage and Computation Frameworks, Performance Optimizations
  • Analytics & Visualizations
  • Infrastructure & Cloud Computing
  • Data Management Platforms
  • Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
  • Build functionality for data analytics, search and aggregation

Qualifications

Your Skills & Experience:

  • Minimum 2 years of experience in Big Data technologies
  • Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required for building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
  • Strong experience in at least one of the programming languages Java, Scala, or Python; Java is preferred
  • Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
  • Well-versed in, and with working knowledge of, data platform-related services on AWS
  • Bachelor’s degree and 4 to 5 years of work experience, or any combination of education, training, and/or experience that demonstrates the ability to perform the duties of the position

Set Yourself Apart With:

  • Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
  • Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra, Alation, etc.
  • Knowledge of distributed messaging frameworks such as ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
  • Performance tuning and optimization of data pipelines
  • Cloud data specialty and other related Big data technology certifications

A Tip from the Hiring Manager:

Join the team to sharpen your skills and expand your collaborative approaches. Make a direct impact on our clients and their businesses through your work.

Additional Information

  • Gender-Neutral Policy
  • 18 paid holidays throughout the year
  • Generous parental leave and new parent transition program
  • Flexible work arrangements
  • Employee Assistance Programs to help you with wellness and well-being
Category: Big Data Jobs

Tags: Agile Airflow Architecture AWS Azure Big Data BigQuery Cassandra Consulting CX Data Analytics Data governance Data management Data pipelines Engineering ETL Flink GCP Hadoop HBase HDFS Informatica Java Kafka Microservices MongoDB MPP MySQL NiFi NoSQL Oozie Oracle Pipelines PostgreSQL Pulsar Python Redshift Scala Security Spark SQL Streaming Talend

Perks/benefits: Career development Flex hours Health care Parental leave Wellness

Region: Asia/Pacific
Country: India
