Senior Data Engineer
Trivandrum, Kerala, India
Envestnet
Explore our connected ecosystem of solutions, intelligence, and technologies that connect people’s daily lives with their long-term goals. See how we’re equipping advisors with the tools and resources needed to deliver the most impactful...
- Deliver end-to-end data and analytics capabilities, including data ingestion, data transformation, data science, and data visualization in collaboration with Data and Analytics stakeholder groups
- Design and deploy databases and data pipelines to support analytics projects (see the illustrative PySpark sketch after this responsibilities list)
- Develop scalable and fault-tolerant workflows
- Clearly document issues, solutions, findings, and recommendations to be shared internally and externally
- Learn and apply tools and technologies proficiently, including:
- Languages: Python, PySpark, ANSI SQL, Python ML libraries
- Frameworks/Platform: Spark, Snowflake, Airflow, Hadoop, Kafka
- Cloud Computing: AWS
- Tools/Products: PyCharm, Jupyter, Tableau, PowerBI
- Optimize the performance of queries and dashboards
- Develop and deliver clear, compelling briefings to internal and external stakeholders on findings, recommendations, and solutions
- Analyze client data and systems to determine whether requirements can be met
- Test and validate data pipelines, transformations, datasets, reports, and dashboards built by the team
- Develop and communicate solutions architectures and present solutions to both business and technical stakeholders
- Provide end user support to other data engineers and analysts
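The following is a minimal sketch of the kind of batch transformation these responsibilities involve, written in PySpark. The bucket paths, column names, and aggregation logic are assumptions used purely for illustration and are not taken from the role description.

```python
# Minimal PySpark batch transformation sketch (paths/columns are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_trades_rollup").getOrCreate()

# Read raw events (the S3 path and schema are assumptions for illustration).
raw = spark.read.parquet("s3://example-bucket/raw/trades/")

# Aggregate to a daily summary per account.
daily = (
    raw
    .withColumn("trade_date", F.to_date("trade_ts"))
    .groupBy("account_id", "trade_date")
    .agg(
        F.count("*").alias("trade_count"),
        F.sum("amount").alias("total_amount"),
    )
)

# Write partitioned output for downstream BI tools such as Tableau or Power BI.
daily.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-bucket/curated/daily_trades/"
)
```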
- Expert experience in the following (should have / good to have):
- SQL, Python, PySpark, Python ML libraries. Other programming languages (R, Scala, SAS, Java, etc.) are a plus
- Data and analytics technologies including SQL/NoSQL/Graph databases, ETL, and BI
- Knowledge of CI/CD and related tools such as GitLab, AWS CodeCommit, etc.
- AWS services including EMR, Glue, Athena, Batch, Lambda, CloudWatch, DynamoDB, EC2, CloudFormation, IAM, and EDS
- Exposure to Snowflake and Airflow (a minimal Airflow DAG sketch follows the requirements list)
- Solid scripting skills (e.g., bash/shell scripts, Python)
- Proven work experience in the following:
- Data streaming technologies
- Big Data technologies including Hadoop, Spark, Hive, Teradata, etc.
- Linux command-line operations
- Networking knowledge (OSI network layers, TCP/IP, virtualization)
- Ability to lead the team, communicate with the business, and gather and interpret business requirements
- Experience with agile delivery methodologies using Jira or similar tools
- Experience working with remote teams
- AWS Solutions Architect / Developer / Data Analytics Specialty certifications; Professional-level certification is a plus
- Bachelor's Degree in Computer Science or a relevant field; a Master's Degree is a plus
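Below is a minimal Airflow DAG sketch illustrating the orchestration experience listed above. The DAG id, schedule, and task callables are hypothetical placeholders, not details from this posting.

```python
# Minimal Airflow 2.x DAG sketch; ids, schedule, and callables are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull data from a source system.
    pass


def transform():
    # Placeholder: run the PySpark/Snowflake transformation step.
    pass


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run extract before transform; retries/alerting would be configured per task.
    extract_task >> transform_task
```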