Data Engineer
Ho Chi Minh City, VN, 700000
GFT Technologies
We support our clients with state-of-the-art IT solutions, technologies, and consulting in the digital transformation.
About GFT
GFT Technologies is driving the digital transformation of the world’s leading financial institutions. Other sectors, such as industry and insurance, also leverage GFT’s strong consulting and implementation skills across all aspects of pioneering technologies, such as cloud engineering, artificial intelligence, the Internet of Things for Industry 4.0, and blockchain.
With its in-depth technological expertise, strong partnerships and scalable IT solutions, GFT increases productivity in software development. This provides clients with faster access to new IT applications and innovative business models, while also reducing risk.
We’ve been a pioneer of near-shore delivery since 2001 and now offer an international team spanning 16 countries, with a global workforce of over 9,000 people. GFT is recognised by industry analysts such as Everest Group as a leader among global mid-sized service integrators, and is ranked in the Top 20 leading global service integrators across many exponential technologies, such as Open Banking, Blockchain, Digital Banking, and Apps Services.
Role Summary:
As a Data Engineer at GFT, you will play a pivotal role in designing, maintaining, and enhancing various analytical and operational services and infrastructure crucial for the organization's functions. You'll collaborate closely with cross-functional teams to ensure the seamless flow of data for critical decision-making processes.
Key Activities:
- Data Infrastructure Design and Maintenance: Architect, maintain, and enhance analytical and operational services and infrastructure, including data lakes, databases, data pipelines, and metadata repositories, to ensure accurate and timely delivery of actionable insights.
- Collaboration: Work closely with data science teams to design and implement data schemas and models, integrate new data sources with product teams, and collaborate with other data engineers to implement cutting-edge technologies in the data space.
- Data Processing: Develop and optimize large-scale batch and real-time data processing systems to support the organization's growth and improvement initiatives.
- Workflow Management: Utilize workflow scheduling and monitoring tools like Apache Airflow and AWS Batch to ensure efficient data processing and management.
- Quality Assurance: Implement robust testing strategies to ensure the reliability and usability of data processing systems.
- Continuous Improvement: Stay abreast of emerging technologies and best practices in data engineering, and propose and implement optimizations to enhance development efficiency.
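The batch-processing work described above can be sketched with pandas, one of the Python frameworks listed under the required skills. The dataset and column names below are purely illustrative, not part of any actual GFT project:

```python
import pandas as pd

# Hypothetical raw order events; in practice these would be read from a
# data lake, warehouse table, or streaming sink.
orders = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "amount": [10.0, 15.0, 7.5, 30.0],
})

# A typical batch transformation: aggregate revenue per customer.
revenue = (
    orders.groupby("customer_id", as_index=False)["amount"]
          .sum()
          .rename(columns={"amount": "revenue"})
)
```

At larger scale the same groupby-aggregate pattern would typically move to PySpark, which exposes a near-identical DataFrame API over distributed data.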
Required Skills:
- 3-8 years of experience as a Data Engineer.
- Technical Expertise: Proficient in Unix environments, distributed and cloud computing, Python frameworks (e.g., pandas, pyspark), version control systems (e.g., git), and workflow scheduling tools (e.g., Apache Airflow).
- Database Proficiency: Experience with columnar and big data databases like Athena, Redshift, Vertica, and Hive/Hadoop.
- Cloud Services: Familiarity with AWS services such as Glue, EMR, EC2, S3, and Lambda, or their equivalents on other cloud platforms.
- Containerization: Experience with container management and orchestration tools like Docker, ECS, and Kubernetes.
- CI/CD: Knowledge of CI/CD tools such as Jenkins, CircleCI, or AWS CodePipeline.
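As an illustration of the workflow-scheduling tools named above, a minimal Apache Airflow DAG definition might look like the following. The DAG id, task names, and schedule are hypothetical, and the sketch assumes Airflow 2.4+; it is a pipeline-definition fragment rather than a runnable script:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw data from a source system (placeholder)."""


def load():
    """Write transformed data to the warehouse (placeholder)."""


with DAG(
    dag_id="daily_orders_etl",       # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # run once per day
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task        # load runs only after extract succeeds
```

The `>>` operator declares task dependencies, which is how Airflow builds the directed acyclic graph it schedules and monitors.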
Nice-to-have requirements:
- Programming Languages: Familiarity with JVM languages like Java or Scala.
- Database Technologies: Experience with RDBMS (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., DynamoDB, Redis).
- BI Tools: Exposure to enterprise BI tools like Tableau, Looker, or Power BI.
- Data Science Environments: Understanding of data science environments like AWS Sagemaker or Databricks.
- Monitoring and Logging: Knowledge of log ingestion and monitoring tools like ELK stack or Datadog.
- Data Privacy and Security: Understanding of data privacy and security tools and concepts.
- Messaging Systems: Familiarity with distributed messaging and event streaming systems like Kafka or RabbitMQ.
What we offer you:
You will be working with some of the brightest people in business and technology on challenging and rewarding projects in a team of like-minded individuals. GFT prides itself on its international environment that promotes professional and cultural exchange and encourages further individual development.