Sr. Eng. I, Distrib Tech
Hyderabad - Ranga Reddy, India
Invesco
As one of the world’s leading asset managers, Invesco is dedicated to helping investors worldwide achieve their financial objectives. By delivering the combined power of our distinctive investment management capabilities, we provide a wide range of investment strategies and vehicles to our clients around the world.
If you're looking for challenging work, smart colleagues, and a global employer with a social conscience, come explore your potential at Invesco. Make a difference every day!
Job Description
Invesco understands that data and the products created from data are the lifeblood of our business. Data-first functions work together seamlessly to enable Invesco to achieve the true value of data and data products. Invesco leverages data as a strategic asset by making quality, trusted data and content available to the right people, at the right time, in the right format, in the most efficient way possible to enable both large transformations and day-to-day operations.
This Senior Data Engineer role sits within the Product Engagement part of our Digital, Distribution, and Enterprise Engineering (D2E2) organization. You will maintain good relationships with internal partners and collaborate with peers across the Distribution business domain. You will develop a strong understanding of investments and distribution data and business processes, including how that data is produced and consumed across all Distribution channels and how to apply Invesco’s data strategy and practices to drive Distribution’s data capabilities forward.
The focus of this position is to design, develop, and manage our data infrastructure and pipelines. In this role, you will leverage modern data tools including Snowflake, PostgreSQL, DBT, Airflow, and Airbyte to ensure seamless data integration, transformation, and availability for analytics and business intelligence. You will play a key role in building scalable and efficient data systems to support our growing data needs.
You will be responsible for:
- Data Pipeline Development: Build and maintain end-to-end data pipelines using Airbyte for data ingestion and Airflow for workflow orchestration, ensuring reliable data flow from source systems to Snowflake.
- Data Transformation: Implement and optimize data transformation workflows using DBT to create clean, structured, and analytics-ready datasets in Snowflake.
- Database Management: Design, manage, and optimize PostgreSQL databases for operational data storage and integrate them with Snowflake for analytics use cases.
- Snowflake Administration: Configure and maintain Snowflake as the primary data warehouse, including schema design, performance tuning, and cost management.
- Workflow Automation: Use Airflow to schedule, monitor, and troubleshoot data pipeline jobs, ensuring timely and accurate data delivery.
- Data Integration: Leverage Airbyte to connect and sync data from various sources (e.g., APIs, SaaS platforms, databases) into Snowflake and PostgreSQL.
- Data Quality: Implement testing and validation processes within DBT and Airflow to ensure data accuracy, consistency, and reliability.
- Collaboration: Partner with data analysts, data scientists, and business stakeholders to understand requirements and deliver tailored data solutions.
- Performance Optimization: Monitor and optimize pipeline performance, query execution in Snowflake, and resource usage across all tools.
- Documentation: Maintain detailed documentation of pipelines, transformations, and database schemas for team reference and compliance.
- Problem-Solving: Demonstrate strong problem-solving skills.
- Meetings: Attend daily and weekly team meetings and other required meetings.
- Communication: Ensure clear and accurate communication and respond to data, business, and technology partners and peers in a timely manner.
This individual will:
- Be comfortable working in a scaled agile development environment.
- Handle multiple requests from varied stakeholders while maintaining clear priorities and adjusting when needed.
- Be an effective “translator” of business needs into technical requirements.
- Build relationships, collaborate with, and influence internal and external teams.
- Work under the guidance of senior team members, take initiative, and complete projects on time and with great attention to detail.
The experience you bring:
- Education: Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field (or equivalent experience).
- Experience: Minimum 5-6 years of experience with AWS cloud platforms where Snowflake, Airflow, and Airbyte are deployed.
- Relational Databases and Data Warehouse: Minimum 4-7 years of experience in relational databases and data warehousing.
- Programming: Strong scripting skills in programming languages like SQL and Python, as you will be responsible for writing queries and other data transformation scripts.
- Data Knowledge: Solid knowledge of data lifecycle, data governance, data risk, master data management concepts, data modeling, business intelligence, and analytics concepts.
- Teamwork: Ability to work in a team/group setting.
- Problem-Solving: Ability to interpret complex or vague instructions and successfully produce results. Proactively identifies, analyzes, and resolves procedural fail points.
- Judgment and Initiative: Demonstrates superior judgment, reasoning, and follow-up skills. Shows a high level of initiative, assertiveness, and self-confidence.
- Agile Methodology: Strong working knowledge of Agile methodology.
- Communication: Shares information efficiently between team members. Promotes teamwork. Excellent listening, interpersonal, written, and oral communication skills.
- Time Management: Effectively manages multiple responsibilities and meets deadlines.
- Analytical Skills: Excellent analytical, mathematical, and creative problem-solving skills.
- Attention to Detail: Logical and efficient, with attention to detail.
- Data Modeling: An understanding of the concepts related to data modeling, coding, and process flow design.
This role offers an exciting opportunity to work with cutting-edge data technologies and contribute to the development of a robust data infrastructure that supports our organization's analytical and business intelligence needs. If you are passionate about data engineering and thrive in a collaborative environment, we encourage you to apply.
Full Time / Part Time: Full time
Worker Type: Employee
Job Exempt (Yes / No): Yes
Workplace Model
At Invesco, our workplace model supports our culture and meets the needs of our clients while providing flexibility our employees value. As a full-time employee, compliance with the workplace policy means working with your direct manager to create a schedule where you will work in your designated office at least three days a week, with two days working outside an Invesco office.
Why Invesco
At Invesco, we act with integrity and do meaningful work to create impact for our stakeholders. We believe our culture is stronger when we all feel we belong, and we respect each other’s identities, lives, health, and well-being. We come together to create better solutions for our clients, our business, and each other by building on different voices and perspectives. We nurture and encourage each other to ensure our meaningful growth, both personally and professionally.
We believe in a diverse, inclusive, and supportive workplace where everyone feels equally valued, and this starts at the top with our senior leaders having diversity and inclusion goals. Our global focus on diversity and inclusion has grown exponentially, and we encourage connection and community through our many employee-led Business Resource Groups (BRGs).
What’s in it for you?
As an organization, we support personal needs and diverse backgrounds, and we provide internal networks as well as opportunities to get involved in the community and in the world.
Our benefits policy includes, but is not limited to:
- Competitive Compensation
- Flexible, Hybrid Work
- 30 days’ Annual Leave + Public Holidays
- Life Insurance
- Retirement Planning
- Group Personal Accident Insurance
- Medical Insurance for Employee and Family
- Annual Health Check-up
- 26 weeks Maternity Leave
- Paternity Leave
- Adoption Leave
- Near site Childcare Facility
- Employee Assistance Program
- Study Support
- Employee Stock Purchase Plan
- ESG Commitments and Goals
- Business Resource Groups
- Career Development Programs
- Mentoring Programs
- Invesco Cares
- Dress for your Day
At Invesco, we offer development opportunities that help you thrive as a lifelong learner in a constantly evolving business environment and ensure your continued growth. Our AI-enabled learning platform delivers curated content based on your role and interests. We ensure our managers and leaders also have many opportunities to advance their skills and competencies, which are pivotal in their continuous pursuit of performance excellence.
To learn more about us:
About Invesco: https://www.invesco.com/corporate/en/home.html
About our Culture: https://www.invesco.com/corporate/en/about-us/our-culture.html
About our D&I policy: https://www.invesco.com/corporate/en/our-commitments/diversity-and-inclusion.html
About our CR program: https://www.invesco.com/corporate/en/our-commitments/corporate-responsibility.html
Apply for the role @ Invesco Careers: https://careers.invesco.com/india/