Data Analytics Engineer

Bengaluru, IN


About SKF

 

SKF started its operations in India in 1923. Today, SKF provides industry-leading automotive and industrial engineered solutions through its five technology-centric platforms: bearings and units, seals, mechatronics, lubrication solutions, and services. Over the years, the company has evolved from a pioneering ball-bearing manufacturer into a knowledge-driven engineering company that helps customers achieve sustainable and competitive business excellence.

 

SKF's solutions provide sustainable ways for companies across the automotive and industrial sectors to achieve breakthroughs in friction reduction, energy efficiency, and equipment longevity and reliability. With a strong commitment to research-based innovation, SKF India offers customized, value-added solutions that integrate all five of its technology platforms.

To know more, please visit: www.skf.com/in

 

SKF India has been recognized as a “Top Employer 2024” by the Top Employers Institute, acknowledging excellence in People Practices.

To know more, please visit: https://www.skf.com/in/news-and-events/news/2024/2024-01-18-skf-india-recognized-as-top-employer-2024-by-top-employers-institute

SKF Purpose Statement

Together, we re-imagine rotation for a better tomorrow.

By creating intelligent and clean solutions for people and the planet

 

 

JOB DESCRIPTION

 

Job Title: Data Analytics Engineer

Reports To: Lead Data Analytics & Projects

Role Type: Individual Contributor

Location: Bengaluru

 

Role Purpose:

Design, build, and maintain the data infrastructure and systems that support SKF's commercial excellence data needs. By applying skills in data modeling, integration, processing, storage, retrieval, transformation, performance optimization, and reporting, this role helps the commercial excellence data analytics team manage and use its data more effectively.

 

Primary Responsibilities:

  • Data Warehouse Development: Design and develop scalable, secure, and compliant data warehouses using Snowflake and dbt technologies. This includes creating Snowflake data models and performing transformations in dbt (a minimal model sketch follows this list).
  • Analytical Skills: Exhibit strong problem-solving abilities and attention to detail. Analyze and organize raw data and generate reports using Power BI.
  • SQL Expertise: Demonstrate proficiency in SQL, including writing complex queries and optimizing SQL performance in Snowflake and dbt.
  • ETL/ELT Processes: Demonstrate familiarity with Snowflake’s ETL/ELT tools and techniques, as well as dbt’s transformation methods.
  • Data Modeling: Possess a solid understanding of data modeling concepts and be adept with Snowflake’s and dbt’s data modeling tools and techniques.
  • Performance Optimization: Optimize the performance of Snowflake and dbt queries and data loading processes. This includes optimizing SQL queries, defining clustering keys, tuning data loading processes, and automating manual tasks.
  • Security and Access Management: Manage the security and access controls of the Snowflake environment, including configuring user roles and permissions, managing encryption keys, and monitoring access logs.
  • Database Maintenance: Maintain existing databases and warehouse solutions, addressing support needs, enhancements, and troubleshooting.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand and meet data requirements.
  • Documentation and Reporting: Document data processes and pipeline architecture using dbt.
  • Training and Support: Provide training and support to team members on the usage of dbt and Snowflake.
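For illustration, the Snowflake-and-dbt modeling work described above usually takes the form of SQL models like the sketch below. This is a minimal, hypothetical example: the crm source, the orders table, and every column name are assumptions, and it presumes a matching source declaration already exists in the dbt project.

    -- models/staging/stg_orders.sql
    -- Hypothetical staging model: casts and renames raw columns and
    -- materializes the result as a view in Snowflake.
    {{ config(materialized='view') }}

    select
        order_id::number        as order_id,
        customer_id::number     as customer_id,
        order_date::date        as order_date,
        amount::number(12, 2)   as order_amount
    from {{ source('crm', 'orders') }}
    where order_id is not null

Running dbt run builds this model in the target Snowflake schema, and downstream models can then reference it with {{ ref('stg_orders') }}.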

 

  • dbt (data build tool) Specific Skills:
    • Transform raw data into structured formats using dbt.
    • Implement data quality checks and testing within dbt to ensure accuracy and consistency (see the test sketch after this list).
    • Maintain data accuracy and consistency throughout the pipeline.
    • Document data models, transformations, and processes comprehensively.
    • Optimize dbt models and transformations for maximum efficiency.
    • Manage dbt projects using version control systems like Git.
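A data quality check of the kind listed above can be written as a singular dbt test: a SQL file under tests/ that fails when it returns any rows. The names below follow the hypothetical staging sketch earlier and are assumptions, not part of the role description.

    -- tests/assert_order_amount_positive.sql
    -- Hypothetical singular dbt test: dbt test reports a failure if
    -- this query returns one or more rows.
    select
        order_id,
        order_amount
    from {{ ref('stg_orders') }}
    where order_amount <= 0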

 

  • Technical Skills:
    • Extensive proficiency in Snowflake, dbt, and Power BI (see the Snowflake sketch after this list).
    • Experience with cloud services such as AWS and Azure.
    • Proficient in programming languages including SQL, Python, and Java.
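As a concrete instance of the Snowflake proficiency above, performance tuning in Snowflake typically means defining a clustering key so queries can prune micro-partitions, rather than creating traditional indexes. A minimal sketch, assuming a hypothetical analytics.fct_orders table that is frequently filtered by order_date:

    -- Define a clustering key on a common filter column (hypothetical table).
    alter table analytics.fct_orders cluster by (order_date);

    -- Inspect how well micro-partitions align with that key.
    select system$clustering_information('analytics.fct_orders', '(order_date)');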

Candidate Profile:
  • Preferred Experience: Prior experience as a data engineer in the engineering or IT industry is advantageous.
  • Educational Background: Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Professional Experience: 6-8 years of overall experience in data engineering.

 

 

 

 

