Senior Data Engineer (GCP) - AI Platforms

Porto Alegre, Rio Grande do Sul, Brazil; São Paulo, São Paulo, Brazil; Remote, Brazil

TELUS Digital Brazil





Who We Are

Welcome to TELUS Digital — where innovation drives impact at a global scale. As an award-winning digital product consultancy and the digital division of TELUS, one of Canada’s largest telecommunications providers, we design and deliver transformative customer experiences through cutting-edge technology, agile thinking, and a people-first culture.

With a global team across North America, South America, Central America, Europe, and APAC, we offer end-to-end expertise across eight core service areas: Digital Product Consulting, Digital Marketing Services, Data & AI, Strategy Consulting, Business Operations Modernization, Enterprise Applications, Cloud Engineering, and QA & Test Engineering.

From mobile apps and websites to voice UI, chatbots, AI, customer service, and in-store solutions, TELUS Digital enables seamless, trusted, and digitally powered experiences that meet customers wherever they are — all backed by the secure infrastructure and scale of our multi-billion-dollar parent company.

Location and Flexibility

Due to team distribution and occasional in-person opportunities, this role can be fully remote for candidates based in the states of São Paulo and Rio Grande do Sul, as well as in the cities of Rio de Janeiro, Belo Horizonte, Florianópolis, and Fortaleza. If you are based in São Paulo or Porto Alegre, you are welcome to work from one of our offices on a flexible schedule.

The Opportunity

As a Data Engineer on our growing Fuel IX team, you will be responsible for designing, implementing, and maintaining robust, scalable data pipelines that enable efficient data integration, storage, and processing across our various data sources. You will collaborate with cross-functional teams, including Data Scientists, Software Engineers, and other technical stakeholders, to ensure data quality and support data-driven decision-making.
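To make the day-to-day concrete, here is a minimal sketch of one kind of ingestion step this role involves: batch-loading files from Cloud Storage into BigQuery with the official Python client. The project, bucket, and table names are illustrative assumptions, not details of the actual Fuel IX stack.

    # Minimal sketch: batch-load CSV files from Cloud Storage into BigQuery.
    # Requires the google-cloud-bigquery package and GCP credentials.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project ID

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,  # skip the CSV header row
        autodetect=True,      # infer the schema from the data
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    # The bucket path and destination table below are placeholders.
    load_job = client.load_table_from_uri(
        "gs://my-bucket/events/*.csv",
        "my-project.analytics.events",
        job_config=job_config,
    )
    load_job.result()  # block until the load job completes
    print(f"Loaded {load_job.output_rows} rows")

A production pipeline would layer schema management, partitioning, and monitoring on top of a step like this.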

Responsibilities 

  • Develop and optimize scalable, high-performing, secure, and reliable data pipelines that address diverse business needs
  • Identify opportunities to enhance internal processes, implement automation to streamline manual tasks, and contribute to infrastructure redesign 
  • Help mentor and coach a product team towards shared goals and outcomes
  • Navigate difficult conversations by providing constructive feedback to teams
  • Identify and remove obstacles to quality, improving our user experience and how we build tests
  • Be aware of your own limitations yet curious to learn new solutions, and be receptive to constructive feedback from teammates
  • Engage in ongoing research and adoption of new technologies, libraries, frameworks, and best practices to enhance the capabilities of the data team

Qualifications

  • 5+ years of relevant development experience writing high-quality code as a Data Engineer
  • Active participation in the design and development of data architectures
  • Hands-on experience in developing and optimizing data pipelines
  • Comprehensive understanding of data modeling, ETL processes, and both SQL and NoSQL databases
  • Experience with a general-purpose programming language such as Python or Scala
  • Experience with Google Cloud Platform (GCP) services
  • Experience with containerization technologies such as Docker and Kubernetes
  • Proven track record in implementing and optimizing data warehousing solutions and data lakes
  • Proficiency in DevOps practices and automation tools for continuous integration and deployment of data solutions (a small illustrative quality gate follows this list)
  • Experience with machine learning workflows and supporting data scientists in model deployment
  • Solid understanding of data security and compliance requirements in large-scale data environments
  • Strong ability to communicate effectively with teams and stakeholders, providing and receiving feedback to improve product outcomes
  • Proficiency in spoken and written English
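As a hedged illustration of the DevOps point above, the snippet below sketches a data-quality gate that could run in continuous integration before a deployment; the table name and the check itself are invented for illustration rather than taken from the team's actual setup.

    # Hypothetical CI data-quality gate: fail the build if the target
    # table is empty. The table name and check are illustrative only.
    from google.cloud import bigquery

    def row_count(client: bigquery.Client, table: str) -> int:
        query = f"SELECT COUNT(*) AS n FROM `{table}`"
        result = client.query(query).result()  # run the query and wait
        return next(iter(result)).n

    def test_events_table_not_empty():
        client = bigquery.Client()
        assert row_count(client, "my-project.analytics.events") > 0

Run under pytest in CI, a failing assertion blocks the deployment until the upstream pipeline is fixed.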

Bonus Points

  • Experience with big data tools such as Hadoop, Spark, or Kafka
  • Experience with orchestration tools such as Airflow (see the sketch after this list)
  • Experience working in an Agile development environment and familiarity with Agile methodologies
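For orientation, here is a minimal Airflow DAG sketch showing the shape of the orchestration work mentioned above. It assumes Airflow 2.4+ for the schedule parameter, and the DAG ID, schedule, and task bodies are placeholders rather than a description of the team's real pipelines.

    # Minimal Airflow DAG sketch (assumes Airflow 2.4+); all names are
    # illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_events():
        print("extracting events")  # placeholder for a real extract step

    def load_to_warehouse():
        print("loading events")  # placeholder for a real load step

    with DAG(
        dag_id="daily_events_pipeline",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,  # do not backfill past runs
    ):
        extract = PythonOperator(task_id="extract", python_callable=extract_events)
        load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
        extract >> load  # extract must finish before load starts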

 


