Big Data Engineer - Entry Level

CRI-Sabana, Costa Rica

Equifax

Equifax is searching for creative, driven, and high-energy Software Engineers to join our team. We offer the chance to work on meaningful projects with leading-edge technology, collaborating alongside talented engineers. If you're a forward-thinking, committed, and enthusiastic software engineer with a passion for data and modern development practices, you'll thrive in this role.

What you’ll do

  • Support critical enterprise systems, including data applications, financial platforms, and core business data stores.

  • Design and build efficient data pipelines for seamless integration of new data sources.

  • Manage the full data lifecycle, from extraction and cleansing to ingestion.

  • Contribute to strategic technology planning, translating business needs into actionable technical solutions.

  • Apply strong software engineering fundamentals, providing valuable feedback on release processes, code, and design reviews.

  • Rapidly learn and adopt new technologies, fostering a collaborative and knowledge-sharing environment.

  • Implement modern development practices, such as serverless computing, microservices, CI/CD, and infrastructure-as-code.

What experience you need  

  • Bachelor's degree in Computer Science, Systems Engineering, or a related field.

  • Data Engineering Fundamentals: 6 months to 1 year of hands-on experience with programming languages such as Python, Java, or Scala, coupled with strong proficiency in SQL.

  • Data Pipeline Development: Exposure to or direct involvement in building and managing data pipelines.

  • ETL Processes: Understanding and practical application of Extract, Transform, and Load (ETL) procedures; a minimal sketch follows this list.

  • Big Data Technologies: Familiarity with or project experience using Big Data tools such as Hadoop, Spark, Beam, Hive, Airflow, or similar distributed processing frameworks.

  • Cloud Platforms: Exposure to or hands-on experience with major cloud technologies, specifically GCP or AWS.
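
For candidates newer to the field, the following minimal Python sketch illustrates the extract/transform/load lifecycle referenced above. The file, table, and column names are hypothetical, chosen purely for illustration; pipelines at production scale would typically run on the distributed frameworks named in the bullets.

    # Minimal ETL sketch (illustrative only): extract rows from a CSV
    # source, cleanse them, and ingest them into a local SQLite table.
    # All file, table, and column names below are hypothetical.
    import csv
    import sqlite3

    def extract(path):
        """Read raw rows from a CSV source."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        """Drop incomplete records and normalize field types."""
        return [
            (row["id"], row["name"].strip(), float(row["amount"]))
            for row in rows
            if row.get("id") and row.get("name") and row.get("amount")
        ]

    def load(records, db_path="pipeline.db"):
        """Ingest cleansed records into a relational store."""
        con = sqlite3.connect(db_path)
        con.execute(
            "CREATE TABLE IF NOT EXISTS transactions "
            "(id TEXT, name TEXT, amount REAL)"
        )
        con.executemany("INSERT INTO transactions VALUES (?, ?, ?)", records)
        con.commit()
        con.close()

    if __name__ == "__main__":
        load(transform(extract("source_data.csv")))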

What could set you apart

  • GCP Data Engineering Experience: Familiarity with or direct experience using Google Cloud Platform (GCP) data technologies like BigQuery, Dataproc, Dataflow, and Composer.

  • Data Security: Experience implementing encryption mechanisms for sensitive data, both in transit and at rest (see the sketch after this list).

  • Diverse Data Handling: Proficiency in working with various data sources and structures, including APIs, databases, JSON, CSV, XML, and text files.

  • Relational Databases: Knowledge of relational databases such as Oracle, PostgreSQL, SQL Server, or MySQL.

  • Version Control: Experience with source control management systems such as SVN or Git, and hosting platforms like GitHub.

  • Atlassian Tooling: Familiarity with Atlassian tools like Jira and Confluence.

  • Engineering Design Principles: Foundational knowledge of design patterns, the software engineering development lifecycle, and an awareness of DevOps, SecOps, and FinOps practices.
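
As an illustration of the at-rest half of the data security bullet above, here is a minimal Python sketch using the third-party cryptography package's Fernet (symmetric encryption) API. The payload is hypothetical, and a production system would source its keys from a managed key service rather than generating them inline.

    # Minimal sketch of encrypting sensitive data at rest with Fernet
    # (from the third-party "cryptography" package). Key handling is
    # simplified for illustration; real systems would use a managed KMS.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()     # in practice, fetched from a key manager
    fernet = Fernet(key)

    record = b'{"account": "hypothetical sensitive payload"}'
    token = fernet.encrypt(record)  # ciphertext safe to persist at rest
    assert fernet.decrypt(token) == record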

Primary Location: CRI-Sabana

Function: Tech Dev and Client Services

Schedule: Full time