Data Engineer for Data-driven solutions development

Campinas

Applications have closed

4flow

Discover customized logistics solutions with 4flow that increase your efficiency and reduce costs!


We're looking for a talented Data Engineer to join our growing team and contribute to the development of data-driven software solutions. You'll leverage your expertise in ETL pipelines and MLOps to identify opportunities, design solutions, and translate them into real-world applications.

What your new challenge will look like

  • Partner with stakeholders across the organization to understand business needs and translate them into technical requirements

  • Implement and optimize data pipelines, ensuring data quality and integrity

  • Contribute to the development of a new solution powered by generative AI technology (this may involve working with software engineers or coding yourself, depending on the project)

  • Continuously monitor and evaluate the performance of solutions and applications to ensure effectiveness

  • Communicate complex technical concepts to both technical and non-technical audiences

  • Collaborate with Data Scientists to design and build scalable and efficient data architectures

  • Automate data workflows and support the maintenance of existing data infrastructure

Why you belong at 4flow

  • You have a university degree in Computer Engineering, Statistics, Computer Science, or a related field (or equivalent experience)

  • You have experience developing and supporting real-time data ingestion processes from various data sources

  • You can support requirements definition, estimation, development, and delivery of robust, scalable solutions

  • You know how to build pipelines and deploy applications on cloud platforms such as Azure, AWS, or GCP

  • You have practical experience with Python, SQL or NoSQL databases, and Docker

  • Experience with ETL tools and data integration frameworks

  • DevOps: a good understanding of how to implement DevOps practices and CI/CD

  • MLOps: the ability to support Data Scientists in scaling and automating machine learning projects

  • Experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery

  • APIs and Integration: Familiarity with creating and consuming APIs for data integration

  • Version Control: Strong understanding of version control systems, particularly Git

  • You communicate confidently in English

Bonus points for:

  • Experience contributing to software development projects

  • Knowledge of data visualization tools such as Tableau, Power BI, or similar

  • Familiarity with big data technologies like Hadoop, Spark, or Kafka

  • Experience with container orchestration tools like Kubernetes

  • Understanding of data governance and data security best practices

  • Experience with a specific domain relevant to our industry (please specify in your cover letter)

What we offer

Come join us! 4flow offers a clearly defined vision, excellent job security, and outstanding opportunities for your individual development. As part of a highly international, fast-growing company with a vibrant corporate culture, you will enjoy a competitive base salary, an attractive bonus system, and a great benefits package.

Ready for 4flow? Then apply online, and please upload an English version of your resume with your application.


Category: Engineering Jobs


Perks/benefits: Career development, competitive pay

Region: South America
Country: Brazil
