Consultant II - Data Engineer, Python
Jaipur, Rajasthan
Hakkoda
Hakkoda, an IBM Company, is a modern data consultancy that empowers data-driven organizations to realize the full value of the Snowflake Data Cloud. We provide consulting and managed services in data architecture, data engineering, analytics, and data science. We are renowned for bringing our clients deep expertise, being easy to work with, and being an amazing place to work! We are looking for curious and creative individuals who want to be part of a fast-paced, dynamic environment where everyone’s input and efforts are valued. We hire outstanding individuals and give them the opportunity to thrive in a collaborative atmosphere that values learning, growth, and hard work. Our team is distributed across North America, Latin America, India, and Europe. If you have the desire to be part of an exciting, challenging, and rapidly growing Snowflake consulting services company, and if you are passionate about making a difference in this world, we would love to talk to you!
We are seeking a skilled and collaborative Sr. Data/Python Engineer with experience developing production Python-based applications (such as Django, Flask, or FastAPI on AWS) to support our data platform initiatives and application development. The role will initially focus on building and optimizing Streamlit application development frameworks and CI/CD pipelines, ensuring code reliability through automated testing with Pytest, and enabling team members to deliver updates via CI/CD pipelines. Once the deployment framework is implemented, the Sr. Engineer will own and drive data transformation pipelines in dbt and implement a data quality framework.
Key Responsibilities:
- Lead application testing and productionalization of applications built on top of Snowflake, including implementation and execution of unit and integration testing. Automated test suites use Pytest and Streamlit App Tests to ensure code quality, data accuracy, and system reliability.
- Develop and integrate CI/CD pipelines (e.g., GitHub Actions, Azure DevOps, or GitLab CI) for consistent deployments across dev, staging, and production environments.
- Develop and test AWS-based pipelines using AWS Glue, Airflow (MWAA), and S3.
- Design, develop, and optimize data models and transformation pipelines in Snowflake using SQL and Python.
- Build Streamlit-based applications to enable internal stakeholders to explore and interact with data and models.
- Collaborate with team members and application developers to align requirements and ensure secure, scalable solutions.
- Monitor data pipelines and application performance, optimizing for speed, cost, and user experience.
- Create end-user technical documentation and contribute to knowledge sharing across engineering and analytics teams.
- Work in CST hours and collaborate with onshore and offshore teams.
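As a rough sketch of the unit-testing work described above, a small data transformation helper might be covered by a Pytest-style test like the following. The function, field names, and expected values are hypothetical illustrations, not details from this posting.

```python
# Hypothetical sketch: a transformation function that cleans a raw record
# before loading into Snowflake, plus a Pytest-discoverable test.
# All names and fields here are invented for illustration.

def normalize_order(record: dict) -> dict:
    """Clean a raw order record: cast types, trim and upper-case text."""
    return {
        "order_id": int(record["order_id"]),
        "customer": record["customer"].strip().upper(),
        "amount_usd": round(float(record["amount_usd"]), 2),
    }

def test_normalize_order():
    # Pytest collects any function named test_*; plain asserts suffice.
    raw = {"order_id": "42", "customer": "  acme ", "amount_usd": "19.999"}
    clean = normalize_order(raw)
    assert clean == {"order_id": 42, "customer": "ACME", "amount_usd": 20.0}
```

Tests in this style run under `pytest` with no extra configuration, and Streamlit's App Tests follow the same pattern for exercising app code headlessly.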
Qualifications, Skills & Experience:
- 5+ years of experience in Data Engineering or Python-based application development on AWS (Flask, Django, FastAPI, Streamlit). Experience building data-intensive applications in Python, as well as data pipelines on AWS, is a must.
- Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field (or equivalent experience).
- Proficient in SQL and Python for data manipulation and automation tasks.
- Experience developing and productionalizing applications built on Python-based frameworks such as FastAPI, Django, and Flask.
- Experience with application frameworks such as Streamlit, Angular, or React for rapid data app deployment.
- Solid understanding of software testing principles and experience using Pytest or similar Python frameworks.
- Experience configuring and maintaining CI/CD pipelines for automated testing and deployment.
- Familiarity with version control systems such as GitLab.
- Knowledge of data governance, security best practices, and role-based access control (RBAC) in Snowflake.
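To illustrate the CI/CD qualification above, a minimal GitHub Actions workflow for automated testing might look like the following. This is a sketch only; the job name, Python version, and file paths are assumptions, not taken from this posting.

```yaml
# Hypothetical minimal CI workflow (GitHub Actions): run the Pytest
# suite on every push and pull request. Versions and paths are assumed.
name: ci
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest  # fail the build if any test fails
```

A deployment job gated on this test job, with per-environment variants for dev, staging, and production, would extend the same file.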
Preferred Qualifications:
- Experience with dbt (data build tool) for transformation modeling.
- Knowledge of Snowflake’s advanced features (e.g., masking policies, external functions, Snowpark).
- Exposure to cloud platforms (e.g., AWS, Azure, GCP).
- Strong communication and documentation skills.
Benefits:
- Health Insurance
- Paid leave
- Technical training and certifications
- Robust learning and development opportunities
- Incentive
- Toastmasters
- Food Program
- Fitness Program
- Referral Bonus Program
Hakkoda is committed to fostering diversity, equity, and inclusion within our teams. A diverse workforce enhances our ability to serve clients and enriches our culture. We encourage candidates of all races, genders, sexual orientations, abilities, and experiences to apply, creating a workplace where everyone can succeed and thrive.
Ready to take your career to the next level? 🚀 💻 Apply today 👇 and join a team that’s shaping the future!
Hakkoda has been acquired by IBM and will be integrated into the IBM organization; Hakkoda will remain the hiring entity. By proceeding with this application, you understand that Hakkoda will share your personal information with other IBM subsidiaries involved in your recruitment process, wherever they are located. More information on how IBM protects your personal information, including the safeguards in case of cross-border data transfer, is available here.