DTICI Snowflake Data Engineer T9
Bengaluru, Karnataka, India
Daimler Truck
We are one of the world's largest commercial vehicle manufacturers, with over 40 production sites around the globe and more than 100,000 employees. This team is the core of the Data & AI department for Daimler Truck; it helps develop world-class AI platforms on various clouds (AWS, Azure) to support building analytics solutions, dashboards, ML models, and Gen AI solutions across the globe.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines using Snowflake and other cloud-based tools.
- Implement data ingestion, transformation, and integration processes from various sources (e.g., APIs, flat files, databases).
- Optimize Snowflake performance through clustering, partitioning, and query tuning.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security across all data pipelines and storage.
- Develop and maintain documentation related to data architecture, processes, and best practices.
- Monitor and troubleshoot data pipeline issues and ensure timely resolution.
- Working experience with the medallion architecture and with tools such as Matillion, dbt models, and SNP Glue is highly recommended.
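As a rough illustration of the extract-transform-load pattern the responsibilities above describe (the toy data, column names, and function names here are illustrative assumptions, not part of the role description), a minimal pipeline over a flat-file source might look like:

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse flat-file input into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize types and drop records missing the business key."""
    cleaned = []
    for row in rows:
        if not row.get("vehicle_id"):
            continue  # data-quality rule: skip rows without an identifier
        cleaned.append({
            "vehicle_id": row["vehicle_id"].strip(),
            "mileage_km": int(row["mileage_km"]),
        })
    return cleaned


def load(rows: list[dict], target: list) -> int:
    """Load: append cleaned rows to the target store.

    A plain list stands in for a warehouse table here; in practice this
    step would write to Snowflake via a connector or a staging/COPY step.
    """
    target.extend(rows)
    return len(rows)


# Toy flat-file input: one row is missing its vehicle_id and gets filtered out.
raw = "vehicle_id,mileage_km\nTRK-001,120000\n,9999\nTRK-002,85000\n"
table: list = []
loaded = load(transform(extract(raw)), table)
```

In a production pipeline each stage would typically be a separate, monitored task (for example an Airflow operator or a dbt model), but the extract/transform/load separation shown here is the same.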
Note: The following fixed benefits apply to Daimler Truck, Daimler Buses, and Daimler Truck Financial Services. Among other things, these benefits await you with us:
- Attractive compensation package
- Company pension plan
- Remote working
- Flexible working models that adapt to individual life phases
- Health offers
- Individual development opportunities through our own Learning Academy as well as free access to LinkedIn Learning
- + two individual benefits
Requirements:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- 2–3 years of experience in data engineering or a similar role.
- Strong hands-on experience with Snowflake (data modeling, performance tuning, SnowSQL, etc.).
- Proficiency in SQL and experience with scripting languages like Python or Shell.
- Experience with ETL/ELT tools such as dbt, Apache Airflow, Informatica, or Talend.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and services like S3, Lambda, or Data Factory.
- Understanding of data warehousing concepts and best practices.
- The candidate should have excellent communication skills and be willing to reskill, adapt, and build strong stakeholder relationships.
- An active team member, willing to go the extra mile and bring innovation to work.
ADDITIONAL INFORMATION
We particularly welcome online applications from candidates with disabilities or similar impairments in direct response to this job advertisement. If you have any questions, once you have submitted your application form you can contact the local disability officer, who will gladly assist you in the onward application process: XXX@daimlertruck.com. If you have any questions regarding the application process, please contact HR Services by e-mail: hrservices@daimlertruck.com.