Data Ops
Porto, Portugal
Celfocus
Celfocus is a European high-tech system integrator providing professional services focused on creating business value through Analytics and Cognitive solutions, addressing Telecommunications, Energy & Utilities, Financial Services and other sectors. Make an impact by working in sectors where technology is the enabler, everything is ground-breaking and there is a constant need to be innovative.
Be part of a team that combines business knowledge, technological edge and design experience. Our different backgrounds and know-how are key to developing solutions and experiences for digital clients.
Face challenges and learn other ways of thinking and seeing the world - there’s always room for your energy and creativity.
About the role
DataOps is a collaborative approach that combines data engineering, data quality, and DevOps practices to streamline and automate the entire data lifecycle. The mission of a DataOps professional is to establish efficient processes and frameworks for data ingestion, integration, transformation, and delivery, with a focus on optimizing data workflows, ensuring data quality and governance, and facilitating cross-functional collaboration between data teams and other stakeholders.
As part of your job, you will:
- Set up and maintain the infrastructure necessary for data analytics, including servers, databases, and cloud services;
- Monitor the performance of the analytics infrastructure and tools, identifying and resolving issues as they arise;
- Automate processes wherever possible, including data processing, onboarding, and pipeline orchestration (a minimal sketch follows this list);
- Create and maintain documentation on the analytics infrastructure and tools, including policies, procedures, and best practices.
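To illustrate the kind of pipeline automation this role involves, here is a minimal sketch of a scheduled transformation step on BigQuery. It assumes the google-cloud-bigquery Python client and a GCP project; the dataset and table names (analytics.sales_raw, analytics.sales_clean) are hypothetical placeholders rather than anything specific to this position.

```python
# Minimal sketch of an automated daily transform step, assuming the
# google-cloud-bigquery client library; dataset/table names are hypothetical.
from google.cloud import bigquery


def run_daily_transform(project_id: str = "my-gcp-project") -> int:
    """Run a simple cleanse/transform step and return the curated row count."""
    client = bigquery.Client(project=project_id)

    # Illustrative transformation only; real business logic would replace this SQL.
    sql = """
        CREATE OR REPLACE TABLE analytics.sales_clean AS
        SELECT order_id, customer_id, amount, DATE(created_at) AS order_date
        FROM analytics.sales_raw
        WHERE amount IS NOT NULL
    """
    job = client.query(sql)  # submit the job to BigQuery
    job.result()             # wait for completion; raises on failure

    return client.get_table("analytics.sales_clean").num_rows


if __name__ == "__main__":
    print(f"Rows in curated table: {run_daily_transform()}")
```

In practice a scheduling tool or workflow orchestrator would trigger a step like this and alert on failure, which is where the monitoring and automation duties above come in.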
What are we looking for?
- Availability to work outside regular hours (24/7 support);
- Proficiency in GCP;
- ETL Tools & Processes: Hands-on experience with ETL tools such as Talend, including designing and managing ETL workflows;
- Data Management: Strong understanding of Oracle and BigQuery technologies for data storage and retrieval;
- Data Reporting: Familiarity with reporting tools such as Looker, AtScale, Business Objects or QlikSense;
- Operating Systems: Competence in Linux/UNIX environments;
- Scheduling Tools: Experience with scheduling tools to automate and optimize workflows;
- Domain Knowledge: Insights into business operations, particularly in the Telecommunications sector;
- Infrastructure Knowledge: Understanding of complex, large-scale infrastructure systems;
- Programming Skills: Solid programming logic and proficiency in SQL, Python, or Scala;
- CI/CD Processes: Familiarity with Continuous Integration and Continuous Deployment pipelines using tools like Git.
Personal traits:
- Ability to adapt to different contexts, teams and clients;
- Teamwork skills combined with a sense of autonomy;
- Motivation for international projects, including travel when required;
- Willingness to collaborate with other players;
- Strong communication skills.
We want people who like to roll up their sleeves and open their minds. Think this is you? Come join the team!