Software Engineer I
Lalitpur, Nepal
TechKraft Inc.
TechKraft is a global IT services and consulting company, unlocking opportunities for clients worldwide to outsource operations in strategic regions of the world.
You Will:
Design and Develop Data Pipelines: Build and maintain data pipelines and ETL processes using tools such as Databricks, Snowflake, SQL, and PySpark (see the sketch after this list).
Optimize Data Assets: Contribute to the creation and optimization of data assets, ensuring high standards of data quality, performance, and reliability.
Monitor and Troubleshoot: Aid in the monitoring and troubleshooting of data pipelines to guarantee efficient and uninterrupted data distribution.
Team Collaboration: Work with the Connector Factory team and cross-functional teams to understand client data requirements and translate them into scalable data solutions.
Apply Agile Methodologies: Implement Agile methodologies and best practices to achieve incremental improvements and adapt to evolving requirements.
Effective Communication: Maintain transparent communication with stakeholders to gather and clarify requirements, and to provide regular project updates.
Ensure Data Privacy and Compliance: Maintain a commitment to data privacy, security, and regulatory compliance, given the sensitive nature of healthcare data.
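For illustration only, here is a minimal sketch of the kind of PySpark ETL step described above. The input path, column names, and target table are hypothetical placeholders, and writing in Delta format assumes a Databricks or Delta-enabled environment; none of these details come from the posting itself.

# Minimal PySpark ETL sketch (hypothetical paths, columns, and table names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims_etl_example").getOrCreate()

# Extract: read raw records from a hypothetical landing zone.
raw = spark.read.option("header", True).csv("/data/raw/claims/")

# Transform: deduplicate, drop nulls, cast types, and stamp ingestion time.
cleaned = (
    raw.dropDuplicates(["claim_id"])
       .filter(F.col("claim_amount").isNotNull())
       .withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .withColumn("ingested_at", F.current_timestamp())
)

# Load: write to a managed table (Delta is typical on Databricks).
cleaned.write.mode("overwrite").format("delta").saveAsTable("analytics.claims_clean")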
What We're Looking For:
Educational Background: Bachelor's degree in computer science or a related field.
Relevant Experience: At least 1 year of experience in data engineering, software engineering, or a related role. Fresh graduates are also welcome to apply.
Technical Proficiency: Proficiency in SQL and Python.
Healthcare Data Knowledge: Basic understanding of US healthcare data and terminology is a plus.
Problem-Solving Skills: Strong ability to efficiently tackle data engineering projects and resolve issues.
Agile Methodologies: Familiarity with Agile methodologies and project delivery practices.
Communication and Collaboration: Excellent communication skills, with the ability to collaborate effectively with team members and stakeholders.
Bonus Points:
Relevant certifications in data engineering, cloud computing, or specific technologies such as Databricks, Snowflake, or AWS.