Senior Azure Databricks Engineer
Bengaluru, India
Kongsberg Digital
KONGSBERG is an international, knowledge-based group that supplies high-technology systems and solutions to customers engaged in the oil and gas industry, the merchant marine, and the defence and aerospace industries.

We are looking for a Senior Azure Databricks Engineer to support and maintain our internal BI platform, used by our Finance and Business Operations teams. This is a hands-on technical role focused on backend data operations, including data ingestion, transformation, and CI/CD support, within a cloud-based data warehouse environment.
Key Responsibilities:
• Ensure stable operation of the internal BI platform used by Finance and Business Operations.
• Develop, maintain, and troubleshoot data pipelines for ingestion, transformation, and load using Azure Databricks (PySpark, SQL).
• Support and optimize CI/CD pipelines (Azure DevOps) for smooth deployments and minimal downtime.
• Collaborate with BI front-end analysts, IT teams, and business stakeholders to ensure alignment of data needs and delivery.
• Monitor and improve system performance, resolve incidents, and ensure data quality and consistency.
• Maintain data architecture standards and support platform scalability and compliance.
• Integrate data from systems like D365 Finance & Operations and other business applications.
• Work with Azure services such as Data Lake, Key Vault, Service Principals, and SQL Database.
• Maintain proper documentation of processes, configurations, and procedures.
• Participate in improvement initiatives to enhance platform efficiency and usability.
What you need to succeed:
• 7+ years of experience with business data analytics platforms.
• Strong hands-on experience with Azure Databricks, PySpark, and Spark SQL or SQL.
• Solid understanding of CI/CD pipelines (preferably with Azure DevOps) and troubleshooting deployment issues.
• Proficiency in Python and working knowledge of Shell scripting.
• Experience with data ingestion, ETL processes, and managing large-scale data pipelines.
• Experience with Azure services such as Azure Key Vault, Azure SQL, Azure Data Lake, and Service Principals.
• Understanding of data governance, security standards, and the handling of sensitive data.
• Ability to work closely with both IT and finance/business stakeholders.
• Good knowledge of data integration from sources such as D365 F&O, Unit4, and the Azure Portal.
• Strong analytical, problem-solving, collaboration, and communication skills.