Principal Software Engineer

Lalitpur, Nepal


About the Role: 
Do you have a passion for the craft of software engineering? Our Transformers engineering team is looking for a motivated, versatile, and naturally curious software engineer who is excited about using cutting-edge cloud technology to improve the US healthcare industry. This is a fantastic opportunity to join a world-class engineering team and work cross-functionally with other teams within our company (Executives, Product, Implementation, Delivery, Customer Success, and Sales) to help solve our customers' most challenging business and operational needs.

 

Specific Duties include the following:  

  • Develop and implement virtual, high-performance cloud solutions that conform to US healthcare security standards, leveraging broad experience across platforms such as AWS, Azure, Databricks, and Snowflake, informed by analytical work with end users, product managers, and software/data architects.
  • Build data processing pipelines leveraging AWS/Azure, Airbyte, Databricks, Snowflake, and dbt.
  • Write PySpark, Python, and SQL code to meet requirements for clients or internal teams.
  • Deploy code using CI/CD frameworks.
  • Critically analyze and review peer-authored designs and code.
  • Apply exceptional problem-solving skills to identify and resolve issues before they affect business productivity.
  • Troubleshoot client-reported incidents: identify root causes, fix and document problems, and implement preventive measures.
  • Optimize the performance and cost of data processing workflows.
  • Demonstrate deep working knowledge of Airflow, ETL (Extract, Transform, Load) processes, APIs, and data connectors, and troubleshoot issues related to each.
  • Drive the technical excellence of a team, mentor other team members, and lead by example.
  • Identify areas for technical investment, work with stakeholders to prioritize them on the roadmap, and lead efforts to implement them.
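To give candidates a feel for the kind of pipeline work these duties describe, here is a minimal, hypothetical extract-transform-load step in plain Python (stdlib only). In practice this work would run as PySpark/dbt jobs on Databricks or Snowflake; all names and data below are invented for illustration.

```python
import csv
import io
import sqlite3

# Toy ETL: parse claim rows from CSV text, normalize amounts,
# and load them into SQLite (standing in for a warehouse table).
raw = """claim_id,amount_usd,state
C001,125.50,NY
C002,89.99,CA
C003,240.00,NY
"""

def extract(text):
    """Read CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast dollar amounts to integer cents to avoid float drift downstream."""
    return [
        (r["claim_id"], int(round(float(r["amount_usd"]) * 100)), r["state"])
        for r in rows
    ]

def load(rows, conn):
    """Insert rows and return per-state totals, as a warehouse query would."""
    conn.execute(
        "CREATE TABLE claims (claim_id TEXT, amount_cents INTEGER, state TEXT)"
    )
    conn.executemany("INSERT INTO claims VALUES (?, ?, ?)", rows)
    return conn.execute(
        "SELECT state, SUM(amount_cents) FROM claims GROUP BY state ORDER BY state"
    ).fetchall()

conn = sqlite3.connect(":memory:")
totals = load(transform(extract(raw)), conn)
print(totals)  # [('CA', 8999), ('NY', 36550)]
```

The same extract/transform/load shape carries over to the production stack named above, with DictReader replaced by Airbyte connectors, the list comprehension by PySpark transformations, and SQLite by Snowflake or Databricks tables.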

 

What We’re Looking For: 

  • Bachelor's degree, preferably in Computer Science, Computer Engineering, or a related IT discipline.
  • 8+ years of commercial software development experience.
  • 5+ years of building or using cloud services in a production environment (AWS, Azure, GCP, etc.).
  • 3+ years of building ETL data pipelines at scale with Spark/PySpark and Databricks.
  • Strong programming skills (Python, Java, or other OOP languages).
  • A self-starter with a go-getter mindset.
  • Someone who stays current with emerging technologies and development techniques.
  • Excellent oral and written communication skills; strong analytical, problem-solving, organization, and prioritization skills.

 

Equal Opportunity Employer 
As a mission-led technology company that is helping to drive better healthcare outcomes, the company believes that the best innovation and value we can bring to our customers comes from diverse ideas, thoughts, experiences, and perspectives. We are dedicated to building diverse teams and providing equal employment opportunities to all applicants. The company prohibits discrimination and harassment of any type in regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws. 



