Senior Data Engineer

Metro Manila, Philippines - Remote

Intellectsoft

Intellectsoft is a software development company delivering innovative solutions since 2007. We operate across North America, Latin America, the Nordic region, the UK, and Europe. We specialize in industries like Fintech, Healthcare, EdTech, Construction, Hospitality, and more, partnering with startups, mid-sized businesses, and Fortune 500 companies to drive growth and scalability. Our clients include Jaguar Motors, Universal Pictures, Harley-Davidson, Qualcomm, and London Stock Exchange. Together, our team delivers solutions that make a difference. Learn more at www.intellectsoft.net

Our customer's product is an AI-powered platform that helps businesses make better decisions and operate more efficiently. It applies advanced analytics and machine learning to large volumes of data to produce useful insights and predictions. The platform is widely used across industries, including healthcare, to optimize processes, improve customer experiences, and support innovation, and it integrates easily with existing systems so teams can make quick, data-driven decisions.

Requirements

  • Proficiency in SQL for data manipulation and querying large datasets.
  • Strong experience with Python for data processing and scripting.
  • Expertise in PySpark for distributed data processing and big data workflows (a brief sketch follows this list).
  • Hands-on experience with Airflow for workflow orchestration and automation.
  • Deep understanding of Database Management Systems (DBMS), including design, optimization, and maintenance.
  • Solid knowledge of data modeling, ETL pipelines, and data integration.
  • Familiarity with cloud platforms such as AWS, GCP, or Azure.
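
For a flavor of the PySpark and SQL work described above, here is a minimal, hypothetical sketch of a daily aggregation job. The bucket paths and the orders schema (order_id, created_at, amount) are illustrative assumptions, not details of the actual role or customer platform.

```python
# Hypothetical daily-orders ETL: dedupe raw order events, aggregate per day.
# All paths and column names are assumptions for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()

# Reading from S3 assumes the hadoop-aws connector is configured.
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

daily = (
    orders
    .dropDuplicates(["order_id"])                          # one row per order
    .groupBy(F.to_date("created_at").alias("order_date"))
    .agg(F.count("*").alias("orders"),
         F.sum("amount").alias("revenue"))
)

# The same aggregation is often exposed to analysts as plain SQL:
orders.createOrReplaceTempView("orders")
spark.sql("SELECT to_date(created_at) AS order_date, COUNT(*) AS orders "
          "FROM orders GROUP BY to_date(created_at)").show()

daily.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_orders/")
spark.stop()
```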

Nice-to-have skills

  • Experience with other big data tools (e.g., Hadoop, Kafka, or Snowflake); see the Kafka sketch after this list.
  • Knowledge of DevOps practices, including CI/CD for data pipelines.
  • Familiarity with containerization tools like Docker or Kubernetes.
  • Previous experience working in agile development teams.
  • Understanding of Machine Learning pipelines or frameworks.
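
Of the nice-to-have tools, Kafka is a common companion to data pipelines. Below is a hedged sketch of emitting a pipeline-status event with the kafka-python package; the broker address, topic name, and payload shape are all assumptions for illustration.

```python
# Hypothetical pipeline-status event published to Kafka via kafka-python.
import json

from kafka import KafkaProducer  # assumes the kafka-python package is installed

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed local broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# The topic and payload are illustrative only.
producer.send("pipeline-events", {"pipeline": "daily_orders_etl", "status": "success"})
producer.flush()
producer.close()
```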

Responsibilities

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Build and optimize large-scale data processing frameworks using PySpark.
  • Create workflows and automate processes using Apache Airflow (see the DAG sketch after this list).
  • Manage, monitor, and enhance database performance and integrity.
  • Collaborate with cross-functional teams, including data analysts, scientists, and stakeholders, to understand data needs.
  • Ensure data quality, reliability, and compliance with industry standards.
  • Troubleshoot, debug, and optimize data pipelines and workflows.
  • Continuously evaluate and integrate new tools and technologies to enhance data infrastructure.
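
To make the Airflow responsibility above concrete, here is a minimal sketch of a two-task DAG, assuming Airflow 2.4+; the DAG id, schedule, and the extract/load callables are hypothetical placeholders.

```python
# Hypothetical two-task daily ETL DAG; the callables are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract() -> None:
    print("pull data from the source system")  # real extraction logic goes here


def load() -> None:
    print("write data to the warehouse")       # real load logic goes here


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # `schedule` replaces `schedule_interval` in Airflow 2.4+
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```

In practice each task would call into versioned pipeline code rather than inline functions, with retries and alerting configured on the DAG.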

Benefits

  • 35 paid absence days per year to support each specialist's work-life balance, plus 1 additional day for each subsequent year of cooperation with the company
  • Up to 15 unused absence days can be added to income after 12 months of cooperation
  • Health insurance
  • Depreciation coverage for use of a personal laptop for project needs
  • Udemy courses of your choice
  • Regular soft-skills training
  • Excellence Centers meetups
