2024-7149 Data Engineer
Taguig, Philippines
Arch Global Services (Philippines) Inc.
Company Description
AGSI was incorporated in April 2016. We are committed to supporting the goals of Arch divisions through exceptional service delivery. We pride ourselves on maintaining flexibility and responsiveness to adapt to business unit and industry demands while focusing on sound project management. We are dedicated to growing and developing our employees as we build strong teams with strategic leadership.
Job Description
Have you ever wondered how insurance companies calculate the price of insurance policies?
The Corporate Actuarial Data & Technology Team is a dynamic, growing team at Arch. This team implements innovative insurance pricing models and analytical tools to improve profitability and growth across a wide range of Property & Casualty insurance products.
The Data Engineer will implement complex databases and data pipelines that put analytics at the heart of real-time decision-making. They will work closely with leaders across the company on high-profile analytics projects that drive business strategies. They will provide technical guidance and ensure the use of sound engineering practices and the effective use of resources. As a key member of the Corporate Actuarial Data & Technology data engineering team, they will advance our implementation capabilities and help our team deliver solutions faster.
Job Responsibilities
- Work with strategic partners to solve business problems by developing best-in-class data solutions
- Perform data manipulation using SQL and Python to develop data assets
- Leverage technology to automate data ingestion, transformation and integration
- Provide the appropriate level of documentation of sources and technical solutions
- Evaluate data quality and completeness
- Build strong partnerships with peers across the organization to support data-related goals
- Explore new technologies and data sources with curiosity and creativity
- Identify, design, and implement internal process improvements: automating manual processes and optimizing data pipelines
- Create and maintain optimal data pipeline architecture
Qualifications
Required Skills/Experience
- 1-2+ years of data/software engineering experience working with SQL, Python, and relational databases
- Understanding of and experience with dimensional data modeling (star schema)
- Familiarity with MLOps frameworks and practices
- Strong data manipulation skills for analytics
- Excellent critical thinking skills to tackle complex data challenges
- Strong organizational skills and keen attention to detail
- Exceptional collaboration and relationship-building skills
- Comfortable working in a fast-paced, highly collaborative environment
- Ability to communicate effectively with different target audiences
- Resilient problem-solving and critical thinking skills
- Flexibility - able to meet changing requirements and priorities
Technical Skill | Proficiency Level | Required (R) / Optional (O)
SQL | 3 | R
Python | 3 | R
Git | 2 | R
Microsoft Excel | 2 | R
Data Modeling (dimensional modeling) | 2 | O
Cloud Experience (Azure preferred, AWS/GCP acceptable) | 2 | O
Snowflake | 2 | O
Databricks | 2 | O
Data warehouse architecture | 2 | O
Microsoft Office Products (MS Teams, Outlook, Word, PowerPoint, etc.) | 2 | O
Additional Information
- BS in Computer Science, Information Technology, Data Analytics, Management Information Systems, or an equivalent degree