Senior Software Engineer
Pune
Convera
Convera helps over 26,000 businesses manage FX risk and streamline cross-border payments, securing more value in every transaction.
We are seeking an experienced Senior Software Engineer to oversee the development and use of our data systems. Reporting to the Sr. Manager – Data Engineering, you will join our dynamic team in the foreign exchange payments processing industry. You will define and implement the enterprise data architecture strategy and ensure robust data governance across the organization, a role that requires a deep understanding of business processes, technology, data management, and regulatory compliance. Working closely with business and IT leaders, you will ensure that the enterprise data architecture supports business goals and that data governance policies and standards are followed across the organization. You will also collaborate with data engineers, analysts, cross-functional teams, and other stakeholders to ensure our data platform meets the organization's needs and supports our data-driven initiatives; this includes building a new data platform, integrating data from various sources, and ensuring data availability for application and reporting needs. The candidate should also have experience working with AI/ML technologies and collaborating with data scientists to meet their data requirements.
Motivated by our values: Customer Champions, Growth Minded, Truth Seekers, Fast Movers, High Achievers, Respectfully Candid
We are the largest non-bank B2B cross-border payments company in the world. Formerly Western Union Business Solutions, we leverage decades of industry experience and technology-led payments solutions to deliver smarter money movements to our customers – helping them capture more value with every transaction.
Convera serves more than 30,000 customers ranging from small business owners to enterprise treasurers to educational institutions to financial institutions to law firms to NGOs.
We make moving money so easy that any company in the world can grow with confidence.
You will be responsible for:
- Architect and Develop Data Solutions: Lead the end-to-end design and development of robust data pipelines and data architectures using AWS tools and platforms, including AWS Glue, S3, RDS, Lambda, EMR, and Redshift.
- Optimize ETL Processes: Design and optimize ETL workflows to facilitate the efficient extraction, transformation, and loading of data between diverse source and target systems, including data warehouses, data lakes, and both internal and external platforms.
- Collaborate and Design Data Models: Partner with stakeholders and business units to develop data models that align with business needs, analytical requirements, and industry standards.
- Data Integration and Architecture Maintenance: Collaborate with internal and external teams to design, implement, and maintain data integration solutions, ensuring high data integrity, consistency, and accuracy.
- Implementation and Troubleshooting: Oversee the implementation of data solutions from initial concept through to production. Troubleshoot and resolve complex technical issues to ensure data pipeline stability and high performance.
- Leadership and Mentorship: Provide guidance and leadership to engineering teams, promoting a culture of continuous improvement, knowledge sharing, and technical excellence. Mentor junior engineers and foster their professional growth.
- Innovation and Strategy: Drive technical innovation by staying abreast of industry trends and emerging technologies. Influence technical strategies and decisions to align with organizational goals and objectives.
- Documentation and Best Practices: Develop and maintain comprehensive documentation for data architectures, pipelines, and processes. Establish and enforce best practices for data engineering and quality assurance.
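To make the pipeline responsibilities above concrete, here is a minimal, purely illustrative sketch of the extract/transform/load pattern they describe. It uses only the Python standard library as a stand-in for the AWS stack (S3/Glue as the source, Redshift as the target); the record layout and field names are hypothetical, not Convera's actual schema.

```python
import csv
import io
import json

# Hypothetical raw FX payment records, standing in for an S3/Glue source.
RAW_CSV = """txn_id,amount,currency
T1,100.50,USD
T2,250.00,eur
T3,,USD
"""

def extract(text):
    """Read CSV rows (stand-in for reading from S3 or RDS)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Normalize currency codes and drop rows with missing amounts."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # basic data-quality gate: reject incomplete records
        out.append({
            "txn_id": r["txn_id"],
            "amount": float(r["amount"]),
            "currency": r["currency"].upper(),
        })
    return out

def load(rows):
    """Serialize to JSON lines (stand-in for a warehouse load step)."""
    return "\n".join(json.dumps(r) for r in rows)

if __name__ == "__main__":
    print(load(transform(extract(RAW_CSV))))
```

In a production pipeline each stage would be a separate, monitored task (e.g. an orchestrated DAG step) rather than three in-process functions, but the separation of extract, transform, and load shown here is the same.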
A successful candidate for this position should have:
- Bachelor's degree or equivalent in Computer Science, Engineering, or a related field, with proven experience in designing, deploying, and managing cloud-based infrastructure, preferably for data platforms
- 10+ years of experience in enterprise data architecture, data modeling, data management, and data governance, or a related field.
- Strong proficiency in AWS, including data services (a must), compute, storage, networking, and security services
- Proficiency in programming languages such as Python, Java, or Scala, with a focus on data processing frameworks (e.g., Apache Spark, Kafka)
- Expertise in data engineering tools and technologies (e.g., SQL, Python, Spark, Kafka, Snowflake, Databricks, dbt, Airflow).
- Proficiency in cloud platforms (AWS, Azure, GCP) and their data services.
- Knowledge of data architecture, data modeling, and data governance frameworks.
- Familiarity with DevOps, CI/CD pipelines, and DataOps practices.
- Working knowledge of reporting tools such as Tableau and Power BI.
Must-have skills:
- Expert Level Python or PySpark for complex data engineering tasks
- Advanced SQL: performance tuning, query optimization, window functions, CTEs
- Bash scripting
- Deep understanding of data lake, lakehouse, and warehouse architectures
- Experience with schema evolution, partitioning, and metadata management
- Building and optimizing large-scale ETL/ELT pipelines leveraging tools like Apache Airflow, dbt, Spark, and Kafka
- Expertise in AWS (S3, Glue, EMR, Redshift, DynamoDB, Lambda)
- Designing and implementing data quality frameworks
- Strong understanding of data governance, lineage and compliance
- CI/CD for data pipelines and version control with Git/GitHub.
- Snowflake and AWS certifications are nice to have.
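The "Advanced SQL" bullet above mentions window functions and CTEs; a small example of the combination is a per-client running total. The sketch below runs the query against an in-memory SQLite database (which supports window functions in SQLite 3.25+); the `payments` table and its values are hypothetical, chosen only to illustrate the query shape.

```python
import sqlite3

# In-memory database with hypothetical FX payment data.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE payments (client TEXT, amount REAL);
INSERT INTO payments VALUES
  ('acme', 100), ('acme', 300), ('globex', 50), ('globex', 150);
""")

# A CTE feeding a window function: running total of payments per client.
QUERY = """
WITH ordered AS (
  SELECT client, amount FROM payments
)
SELECT client,
       amount,
       SUM(amount) OVER (PARTITION BY client ORDER BY amount) AS running
FROM ordered
ORDER BY client, amount;
"""

for row in con.execute(QUERY):
    print(row)
```

`PARTITION BY client` restarts the sum for each client, and `ORDER BY amount` inside the `OVER` clause makes it cumulative rather than a plain group total.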
#LI-KP1