Senior Software Engineer - Distribution Factory

Lalitpur, Nepal

TechKraft Inc.

TechKraft is a global IT services and consulting company, helping clients worldwide outsource operations to strategic regions.




As a Senior Software Engineer, you will play a pivotal role in shaping the future of healthcare data management. Our Data Distribution Team is dedicated to building and maintaining state-of-the-art data assets that empower our clients in the US healthcare domain to make informed decisions. You will collaborate closely with our Team Lead and cross-functional teams to design, develop, and optimize data pipelines, ETL processes, and data workflows using cutting-edge technologies such as Databricks, Snowflake, SQL, and PySpark.

In this role, you will be at the forefront of our mission to deliver high-quality, reliable, and performant data solutions that address the evolving needs of our clients. Your expertise will directly contribute to the achievement of our overarching business goals, ensuring that our clients have access to the data they need to drive better healthcare outcomes. You will have the opportunity to work on complex data engineering projects, leveraging your skills in big data technologies and cloud-native architecture to create scalable and efficient data assets. 

You Will: 

  • Design, Develop, and Maintain Data Pipelines: Collaborate with cross-functional teams to create robust data pipelines, ETL processes, and data workflows using Databricks, Snowflake, SQL, and PySpark (a brief illustrative sketch follows this list). 
  • Optimize Data Assets: Ensure data quality, performance, and reliability by creating and optimizing data assets that meet both functional and non-functional business requirements. 
  • Monitor and Troubleshoot: Proactively monitor, troubleshoot, and optimize data pipelines to ensure smooth and efficient data distribution, addressing any issues that arise promptly. 
  • Collaborate with Cross-Functional Teams: Work closely with the Project Lead, Team Lead, and cross-functional teams to understand client data requirements and translate them into scalable data solutions. 
  • Leverage Agile Methodologies: Use Agile methodologies and best practices to deliver incremental improvements and respond to changing requirements, ensuring successful project delivery. 
  • Mentor and Share Knowledge: Collaborate with team members to share knowledge, mentor junior engineers, and facilitate continuous learning and skill development within the team. 
  • Communicate Effectively: Maintain clear and effective communication with stakeholders, both technical and non-technical, to gather and clarify requirements and provide regular project updates. 
  • Ensure Data Privacy and Compliance: Uphold a strong commitment to data privacy, security, and regulatory compliance, considering the sensitive nature of healthcare data. 
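
As a rough illustration of the day-to-day pipeline work described above, the sketch below shows a minimal PySpark job that reads a raw dataset, applies basic cleaning, and writes a curated asset. All table, path, and column names (claims data in an example S3 bucket) are hypothetical placeholders, not details of any TechKraft or client system.

    # Minimal PySpark ETL sketch; table, path, and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims_curation_sketch").getOrCreate()

    # Read a hypothetical raw claims dataset landed as Parquet in cloud storage.
    raw_claims = spark.read.parquet("s3://example-bucket/raw/claims/")

    # Basic cleaning: require a claim ID, normalise the service date, drop duplicates.
    curated = (
        raw_claims
        .filter(F.col("claim_id").isNotNull())
        .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
        .dropDuplicates(["claim_id"])
    )

    # Write the curated asset partitioned by service date for downstream consumers.
    (
        curated.write
        .mode("overwrite")
        .partitionBy("service_date")
        .parquet("s3://example-bucket/curated/claims/")
    )

In practice, a job of this shape would typically run on Databricks and feed curated tables consumed via Snowflake; the sketch only indicates the general structure of an ETL step, not a prescribed implementation.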

What We're Looking For: 

  • Educational Background: Bachelor's or advanced degree in Computer Science, Data Engineering, or a related field. 
  • Relevant Experience: Minimum of 3 years of experience in data engineering, software engineering, or a related role. 
  • Technical Proficiency: Demonstrated expertise in Databricks, Snowflake, SQL, and PySpark. 
  • Big Data and Cloud Technologies: Familiarity with big data technologies, data processing frameworks, and cloud-native architecture. Proven expertise in cloud technologies, particularly AWS, with experience in services such as AWS Glue, S3, Redshift, and Lambda. 
  • Healthcare Data Knowledge: In-depth understanding of US healthcare data and terminology. 
  • Problem-Solving Skills: Strong problem-solving abilities, with a track record of working on complex data engineering projects. 
  • Communication and Collaboration: Excellent communication skills, with the ability to collaborate effectively with both technical and non-technical stakeholders. 

Bonus Points: 

  • Advanced Certifications: Relevant certifications in data engineering, cloud computing, or specific technologies such as Databricks, Snowflake, or AWS. 
  • Data Visualization Skills: Experience with data visualization tools such as Tableau, Power BI, or similar. 



Tags: Agile Architecture AWS AWS Glue Big Data Computer Science Databricks Data management Data pipelines Data quality Data visualization Engineering ETL Lambda Pipelines Power BI Privacy PySpark Redshift Security Snowflake SQL Tableau

Perks/benefits: Career development

Region: Asia/Pacific
Country: Nepal
