Data Engineer II

India

Bottomline Technologies

Business payments made better. Customer engagement made excellent.

Why Choose Bottomline?

Are you ready to transform the way businesses pay and get paid? Bottomline is a global leader in business payments and cash management, with over 30 years of experience and more than $10 trillion in payments moved annually. We're looking for passionate individuals to join our team and help drive impactful results for our customers. If you're dedicated to delighting customers and promoting growth and innovation, we want you on our team!

Position Summary:  

Bottomline is looking for a Data Engineer II to grow with us. 

The Data Engineer is responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. They play a crucial role in building and managing the data pipelines that enable efficient and reliable data integration, transformation, and delivery for all data users across the enterprise. 

The Data Engineer will work on implementing data flows to make data available in the Enterprise Data Warehouse from systems of record and operational data stores. The Data Engineer will get to work with best-in-class cloud technologies (Snowflake, Fivetran, AWS, Azure, Salesforce, Airflow, etc.). 

Along with making data available in the Enterprise Data Warehouse, the Data Engineer will work with Data Analysts to implement data models and calculate key business KPIs for use by the wider business in reporting and analytics. 

The Data Engineer will have the opportunity to learn and develop their skills by working on assignments as part of a Scrum Team. They should be delivery-focused and driven to solve problems. 

 

How you’ll contribute: 

  • Design and develop data pipelines that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems.  
  • Collaborate with data scientists and analysts to optimize models and algorithms for data quality, security, and governance.  
  • Integrate data from different sources, including databases, data warehouses, APIs, and external systems.  
  • Ensure data consistency and integrity during the integration process, performing data validation and cleaning as needed.  
  • Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques.  
  • Optimize data pipelines and data processing workflows for performance, scalability, and efficiency.  
  • Monitor and tune data systems, identify and resolve performance bottlenecks, and implement caching and indexing strategies to enhance query performance.  
  • Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data.  
  • Take authority, responsibility, and accountability for exploiting the value of enterprise information assets and of the analytics used to render insights for decision making, automated decisions, and the augmentation of human performance.  
  • Collaborate with leaders to establish the vision for managing data as a business asset.  
  • Establish the governance of data and algorithms used for analysis, analytical applications, and automated decision making.  

 

What will make you successful:  

  • A bachelor’s degree in computer science, data science, software engineering, information systems, or a related quantitative field 
  • At least four (4) years of work experience in data management disciplines, including data integration, optimization, and data quality, or other areas directly relevant to data engineering responsibilities and tasks.  
  • Proven project experience developing and maintaining data warehouses (Snowflake experience is preferable) 
  • Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support AI, ML, and BI  
  • Strong proficiency in programming languages such as Java or Python, as well as other scripting languages 
  • Previous experience with languages/tools such as SQL  
  • Significant experience working on ETL processes and building pipelines for data retrieval using REST APIs. Knowledge of ETL tools such as Talend or Informatica is preferable 
  • Proficiency in OLAP, Star, Dimensional, and Snowflake schemas. 
  • Basic knowledge of BI tools such as Power BI and Tableau. 
  • Basic knowledge of DevOps tools such as GitHub, Atlassian tools, and VS Code 
  • Experience working in a structured development environment (i.e., an environment with a standard SDLC process). 
  • Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure) and modern data warehouse tools (Snowflake, Databricks) 
  • Experience with database technologies such as SQL, NoSQL, Oracle, or Teradata 
  • Knowledge of Apache technologies such as Kafka and Airflow to build scalable and efficient data pipelines (nice to have). 
  • Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products. 
  • Expert problem-solving and debugging skills, including the ability to trace issues to their source in unfamiliar code or systems and to recognize and solve recurring problems. 
  • Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals. 
  • Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options. 

  

#LifeAtBottomline

We welcome talent at all career stages and are dedicated to understanding and supporting additional needs. We're proud to be an equal opportunity employer, committed to creating an inclusive and open environment for everyone.
