Integration Data Engineer / Python: Contractor
Atlanta, GA
Kaizen Analytix
Rapid Delivery, Continuous Improvement | We understand that analytics is much more than just the math. It’s the story behind the data. We help businesses understand what happened and show how they can learn from these outcomes to predict and...
Kaizen Analytix LLC, an analytics consulting services and product firm that gives clients unmatched speed to value through analytics solutions and actionable business insights, is seeking candidates for a talented Data Engineer to join our team.
As a Data Engineer, you will be expected to work independently: collect requirements, map data to the warehouse, prototype hands-on, and process ad hoc data files, transforming them into structured data that can be ingested on a production basis. The ideal candidate will have a strong background in data engineering, including experience with Python, SQL Server, ETL processes, data modeling, and cloud platforms.
Responsibilities:
- Work with business teams to define data governance policies.
- Implement processes for handling data access requests.
- Work with data SMEs and data architects to create end-user-friendly data models that support the data warehouse setup.
- Apply excellent knowledge of relational database management systems (RDBMS) and data warehouse front-end tools.
- Create and maintain database objects and stored procedures.
- Contribute to the design and support of data architecture, database design and integration, transformations, and load processes.
- Develop ETL processes using Python and related tools.
Job Requirements:
- Hands-on back-end Python developer able to connect to sources and assemble an integrated dataset.
- Ability to design and maintain the master data layer.
- Bachelor’s or master’s degree in computer science, engineering, or a related field.
- 3+ years of data warehouse experience: consuming, wrangling, validating, and developing pipelines for data.
- 5+ years of experience working with SQL and Python.
- Familiarity with the basic principles of distributed computing and data modeling.
- Excellent problem-solving and analytical skills, with the ability to troubleshoot complex data issues and optimize data processes.
- Experience with object-oriented design and coding and testing patterns, including experience with engineering software platforms and data infrastructures.
- Working experience with Dimensional Modeling.
- Experience with other SQL databases such as PostgreSQL and SQL Server is a plus.
- Strong written and verbal communication skills.
- Openness to receiving constructive feedback.
- Ability to work in a fast-paced, rapidly growing company and handle a wide variety of challenges, deadlines, and a diverse array of contacts.