Big Data Engineer - Entry Level

CRI-Sabana

Equifax


Equifax is seeking a Big Data Engineer to join the Corporate Services Alliance, implementing and supporting cutting-edge analytical solutions on the Google Cloud ecosystem. You will find a great place to work here if you are passionate about designing data ingestion jobs, learning new technologies, and proposing and adopting new ones.

 

What you’ll do

  • Design requirements for small systems, or for modules of medium-to-large systems, and produce technical documentation.

  • Apply basic principles of software engineering and follow instructions. Provide meaningful feedback during the release process, code reviews, and design reviews.

  • Quickly absorb and apply new information. Display a cooperative attitude and share knowledge.

  • Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing corporate product platforms.

  • Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.

  • Participate in a tight-knit engineering team employing agile software development practices. Leverage automation within the scope of the effort.

What experience you need 

  • Bachelor's degree in Computer Science, Systems Engineering or equivalent experience

  • Less than 1 year of experience or knowledge in Data Engineering using programming languages such as Python, Java, or Scala, plus SQL (a must)

  • Less than 1 year of experience or knowledge of data pipelines

  • Less than 1 year of experience or knowledge of ETL (Extract, Transform, and Load) procedures

  • Less than 1 year of experience or knowledge of Big Data technologies such as Hadoop, Spark, Beam, Hive, Airflow, or equivalent

  • Less than 1 year of experience or knowledge of cloud technologies such as GCP or AWS

What could set you apart

  • Data Engineering using GCP technologies (BigQuery, Dataproc, Dataflow, Composer, etc.)

  • Experience using encryption mechanisms for sensitive data in transit and at rest

  • Working with multiple data sources and formats, such as APIs, databases, JSON, CSV, XML, and text files

  • Relational databases (e.g. Oracle, PostgreSQL, SQL Server, MySQL)

  • Source code control management systems (e.g. SVN, Git, GitHub)

  • Agile environments (e.g. Scrum, XP)

  • Atlassian tooling (e.g. Jira and Confluence), plus GitHub

  • Automated Testing: JUnit, Selenium, LoadRunner, SoapUI

  • Cloud certification strongly preferred

#LI-DU1
#LI-Hybrid

Primary Location:

CRI-Sabana

Function:

Tech Dev and Client Services

Schedule:

Full time