Senior Data Engineer
Lalitpur, Nepal
TechKraft Inc.
TechKraft is a global IT services and consulting company that helps clients worldwide outsource operations to strategic regions. We are seeking an experienced Senior Data Engineer to join our team. This role offers an opportunity to be a key contributor on a critical feature delivery team, where your expertise will guide the evolution of our data pipeline infrastructure. Our team is responsible for the development and operation of data pipelines that handle diverse data sources through both large batch and streaming systems. You'll work extensively with AWS services and play a crucial role in driving the growth and innovation of our platform.
You Will:
1. Design and Develop Data Systems: Architect, build, and maintain data pipelines and ETL processes utilizing tools such as Databricks, Snowflake, SQL, and PySpark (a brief illustrative sketch follows this list).
2. Enhance Data Quality: Play a pivotal role in creating and optimizing data assets to uphold high standards of data quality, performance, and reliability.
3. Manage Data Pipelines: Actively monitor and troubleshoot data pipelines to ensure efficient and uninterrupted data distribution.
4. Collaborate Across Teams: Partner with the Connector Factory team and cross-functional teams to understand client data requirements and transform these into scalable data solutions.
5. Implement Agile Practices: Apply Agile methodologies and best practices to drive incremental improvements and adapt to emerging requirements.
6. Communicate Effectively: Keep communication channels open with stakeholders to gather and clarify requirements and provide regular updates on project progress.
7. Ensure Data Security: Stay committed to data privacy, security, and regulatory compliance, particularly given the sensitive nature of healthcare data.
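For candidates gauging the day-to-day work, the following minimal PySpark sketch illustrates the kind of batch pipeline described in item 1. It is illustrative only; the paths, table names, and columns (event_id, event_ts) are hypothetical and not drawn from TechKraft's systems.

    # Illustrative sketch only: a minimal PySpark batch ETL job.
    # All paths and column names (event_id, event_ts) are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("events-batch-etl").getOrCreate()

    # Read a raw batch source (placeholder S3 path).
    raw = spark.read.json("s3://example-bucket/raw/events/")

    # Basic data-quality gates: drop malformed rows and duplicates,
    # then derive a partition column and an ingestion timestamp.
    clean = (
        raw.filter(F.col("event_id").isNotNull())
           .dropDuplicates(["event_id"])
           .withColumn("event_date", F.to_date("event_ts"))
           .withColumn("ingested_at", F.current_timestamp())
    )

    # Persist in a columnar format, partitioned for downstream analytics.
    clean.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-bucket/curated/events/"
    )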
What We're Looking For:
1. Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field. An advanced degree is a plus.
2. Extensive Experience: A minimum of 5 years of experience in data engineering and big data architecture.
3. Technical Expertise: Deep knowledge of designing and maintaining big data architectures, including data lakes, columnar databases, large batch processing (Spark), and stream processing (Kafka).
4. Cloud Proficiency: Strong experience with AWS data services and building scalable, distributed systems on cloud platforms.
5. Programming Skills: Proficiency in Python or other object-oriented programming languages.
6. Data Analysis Skills: Hands-on experience with data analysis and modeling of large data sets.
7. Project Management: Strong organizational skills and experience managing complex projects.
8. Root Cause Analysis: Proven ability to perform root cause analysis on data processes to improve efficiency and answer business questions.
9. Adaptability: A willingness to learn about new technologies and adapt to changing environments.
10. Independent and Collaborative Work: Ability to self-direct tasks and effectively collaborate within a technical team.
11. Infrastructure Automation: Familiarity with tools such as Terraform and GitLab CI/CD for infrastructure automation.
12. Business Acumen: Comfort with ambiguity and a keen interest in solving business-related problems.
13. Agile Experience: Background working in an Agile delivery framework.
Bonus Points:
· Relevant certifications in data engineering, cloud computing, or specific technologies such as Databricks, Snowflake, or AWS.