717 - Ssr/Sr Data Engineer

Córdoba, Córdoba Province, Argentina

Darwoft

Darwoft is an industry-leading custom software development company specializing in mobile and web app UX and development.

Data Engineer

What You'll Bring to the Team:

We are seeking an experienced Data Engineer to design and maintain the data pipelines that power analytics and advanced reporting on our client's Enterprise Data Platform. As a member of the R&D organization, you will manage critical business data assets, leveraging cloud-based big data tools to enable data-driven decisions. You will bring a strong focus on data quality and collaborate with security and compliance experts, understanding the unique data sovereignty requirements of a global customer base.

Responsibilities:

  • Craft and build reusable components, frameworks, and libraries at scale to support analytics products.
  • Recommend tools and techniques for efficient data movement, transformation, and storage to facilitate a high-performance data warehouse environment.
  • Build and deploy scalable data pipelines to power analytics and reporting across multiple source systems.
  • Deliver data platform infrastructure as code, managing deployment and configuration requirements as well as internal release documentation.
  • Identify and address data management issues to improve data quality.
  • Contribute to our culture, propose innovative solutions to industry challenges, provide constructive feedback, and help create a company that drives meaningful change.
  • Implement redundant systems, policies, and procedures for disaster recovery and data archiving to ensure data availability, protection, and integrity.
  • Plan for capacity and resource expansion to support data warehouse scalability.
  • Collaborate with cross-functional teams to define and deliver reports based on business requirements.
  • Participate in data warehouse improvement and growth projects.
  • Monitor system performance, optimize stored procedures, and improve query execution efficiency.
  • Ensure data security and privacy best practices are applied appropriately.
  • Troubleshoot data issues and perform root cause analysis to proactively resolve operational challenges.
  • Create, update, and maintain system documentation.

Ideal Candidate Profile:

  • Bachelor's or Master's degree in Computer Science, Data Science, Information Science, or a related field, or equivalent work experience.
  • 2+ years of experience with SQL on multiple database platforms.
  • 1+ years of experience working with cloud-based enterprise analytics platforms and/or data warehouse projects (Snowflake preferred).
  • Strong programming background in data science-focused languages such as Python, Scala, or R.
  • Solid understanding of both relational and NoSQL database modeling and schema design principles.
  • Experience working with large datasets and developing high-performance queries.
  • Hands-on experience with large-scale data migrations.
  • Strong knowledge of data security best practices.
  • Proficiency with Git and commitment to documentation best practices.
  • Ability to thrive in a hybrid work environment.
  • Positive and action-oriented mindset.
  • Strong interpersonal and communication skills, with the ability to ask the right questions.
  • Self-motivated and self-managing, with excellent task organization skills.
  • Ability to clearly and concisely communicate technical requirements and recommendations.
  • Proficiency in SQL & NoSQL databases (Snowflake, MongoDB), particularly in development or reporting.
  • Strong understanding of relational data structures, theories, and principles.
  • Experience mentoring and training other developers and engineers on data engineering best practices.
  • Strong knowledge of applicable data privacy regulations and best practices.
  • Strong SQL and schema comprehension skills, including many-to-many relationships.
  • Deep understanding of data structures and their implementation.

Bonus Points If You Have:

  • Experience with Snowflake, dbt, Fivetran, and Tableau.
  • Prior exposure to distributed data frameworks.
  • Proficiency in building modular applications.
  • Experience with Microservices and/or Service-Oriented Architecture.
  • Experience with database management and data operations.

How to Apply:

Interested candidates are encouraged to submit their resume and a cover letter outlining their relevant experience and qualifications to talento@darwoft.com.

Questions?
Follow the Recruiter
https://www.linkedin.com/in/hernanvietto/

Category: Engineering Jobs

Tags: Architecture Big Data Computer Science Data management DataOps Data pipelines Data quality Data warehouse dbt Engineering Fivetran Git Microservices MongoDB NoSQL Pipelines Privacy Python R R&D Scala Security Snowflake SQL Tableau

Perks/benefits: Startup environment

Region: South America
Country: Argentina