Data Engineer
Yerevan
HCVT
We are a leading CPA firm offering tax planning, audit, accounting, and business management services to companies and individuals in the Los Angeles area. Apply today and find out why so many come for the challenge and stay for the experience. We look forward to meeting you!
Our advisory team is dedicated to providing data-driven insights and strategic direction to our clients. We are a dynamic and innovative group that values expertise, collaboration, and a commitment to excellence.
We are seeking an experienced Data Engineer to join our advisory team. The ideal candidate will have 4+ years of experience in data engineering and will be responsible for designing, building, and maintaining our data infrastructure. This role requires a proactive and detail-oriented individual who can work effectively in a fast-paced environment to deliver high-quality data solutions.
Work hours are generally 8:00–17:00 with a lunch break. Hours may be flexible during the day, and you will coordinate with your US team to ensure some of your working hours overlap with their schedule.
As a Data Engineer in our Armenia office, you will be responsible for the following:
- Design, develop, and optimize data pipelines and ETL processes from various APIs to support data integration and analysis.
- Collaborate with management, analysts, and other stakeholders to understand data requirements and deliver solutions that meet their needs.
- Ensure data quality, integrity, and security across all data sources and systems.
- Perform data modeling, schema design, and database management.
- Monitor and troubleshoot data pipeline performance and reliability issues.
- Document data engineering processes, standards, and best practices.
- Stay current with emerging trends and technologies in data engineering and recommend improvements to existing systems.
To be successful, you will need the following skills, qualities, and experience:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Minimum of 4 years of experience in data engineering or a similar role.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data pipeline tools (e.g., Apache Airflow, FiveTran).
- Significant experience building pipelines from various data sources into a centralized database or data warehouse.
- Experience with, and a solid understanding of, REST, SOAP, and GraphQL APIs.
- Strong knowledge of SQL and experience working with relational databases (e.g., PostgreSQL, MySQL).
- Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Google Cloud, Azure).
- Understanding of data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, BigQuery).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications:
- Master’s degree in Computer Science, Engineering, or a related field.
- Basic knowledge of machine learning frameworks and libraries.
As required by ordinance, we state in all job solicitations, postings, and advertisements that applicants will be considered in a manner consistent with the requirements of the Fair Chance Initiative.
Perks/benefits: Career development, flex hours, team events