Data Engineer II - Mexico
Mexico Office
Western Governors University
Western Governors University is an online university where you can earn an affordable, accredited, career-focused college degree at an accelerated pace. If you're passionate about building a better future for individuals, communities, and our country, and you're committed to working hard to play your part in building that future, consider WGU as the next step in your career.
Driven by a mission to expand access to higher education through online, competency-based degree programs, WGU is also committed to being a great place to work for a diverse workforce of student-focused professionals. The university has pioneered a new way to learn in the 21st century, one that has received praise from academic, industry, government, and media leaders. Whatever your role, working for WGU gives you a part to play in helping students graduate, creating a better tomorrow for themselves and their families.
Job Responsibilities
Develops and builds ETL/ELT data pipelines for use in data analysis.
Creates and maintains optimal data pipeline architecture.
Keeps data separated and secure across multiple cloud environments.
Assembles large, complex data sets that meet functional and non-functional business requirements.
Delivers ad-hoc and analytical reports to internal users and teams.
Monitors and maintains ETL/ELT jobs and troubleshoots load issues.
Manages change requests/ticket queues for analytical reports and ETL/ELT jobs.
Performs data/technology discovery from new sources and third-party applications for data ingestion.
Creates complex reports and dashboards in Cognos and Tableau.
Ingests and transforms structured, semi-structured, and unstructured data from sources including relational databases, NoSQL, external APIs, JSON, XML, delimited files, and more.
Works in an agile methodology on new development projects, delivering efficient and effective solutions on time.
Analyzes and understands data sources and designs a data model for data capture and ETL/ELT.
Identifies bugs, applies fixes, and checks data quality via process/pipeline audits.
Uses industry best practices for code development, testing, implementation, and documentation.
Performs other job-related duties as assigned.
Knowledge, Skills, and Abilities
Excellent verbal and written communication skills, along with technical documentation skills.
Ability to work with team members, as well as across teams, for product delivery.
Ability to work in an agile environment with timely delivery of ETL/ELT pipelines and reports.
Knowledge and experience using tools like Jira, Confluence, and GitHub.
Ability to develop processes for the audit of data integrity.
Knowledge of and experience with developing validation and testing processes to analyze and debug issues.
Experience with relational SQL and NoSQL databases.
Knowledge of and experience with object-oriented and functional scripting languages, such as Python, Java, and Scala.
Knowledge and experience with big data tools, like Databricks, Hadoop, Spark, Kafka, etc.
Exposure to analytical reporting tools, preferably Power BI and Tableau.
Minimum Qualifications
4 years of experience in Data Engineering, Data Integration, Big Data, Business Intelligence, or Software Engineering.
Bachelor's Degree in Management Information Systems, Computer Science, or a related field.
Equivalent relevant experience performing the essential functions of this job may substitute for the degree requirement. Generally, equivalent relevant experience is defined as one year of experience for one year of education; acceptance of a substitution is at the discretion of the hiring manager.
Physical Requirements:
Prolonged periods of sitting at a desk and working on a computer.
Must be able to lift up to 15 pounds at times.
Location: Guadalajara