Senior DevOps Engineer for Data Engineering team (m/f/x)
Vienna, Austria
Dynatrace
Innovate faster, operate more efficiently, and drive better business outcomes with observability, AI, automation, and application security in one platform.
Company Description
Dynatrace exists to make software work perfectly. Our platform combines broad and deep observability and continuous runtime application security with advanced AIOps to provide answers and intelligent automation from data. This enables innovators to modernize and automate cloud operations, deliver software faster and more securely, and ensure flawless digital experiences.
Job Description
Data = information = knowledge = power. Do you want to hold the keys to that power? Are you motivated by solving challenging problems, where creativity is as crucial as your ability to write code, deliver solutions, and bring valuable data sets together to answer business questions?
If this sounds like an environment where you would thrive, come and join our Data Engineering team. Interested? Because we certainly are!
About the role
At Dynatrace, we are all about automation, self-healing, and a NoOps approach. We preach automation wherever possible, and we live by what we preach.
The Data Engineering team we are hiring for provides the data that drives the world-class Application Intelligence platform that is Dynatrace.
As a DevOps Engineer in Data Engineering, you will help us automate our Data Platform, both by providing the necessary tooling and by designing processes.
It is quite a unique situation: Dynatrace delivers one of the best tools for DevOps, and in this role you would put your experience into driving that product, dogfooding it whenever possible and building a tool for other DevOps engineers as well.
You will be building tools to automate installation at scale, accelerating time-to-value and enhancing the reliability of the Data Platform. That includes scripts, but we may also need to integrate with existing mechanisms via APIs or provide means to reconfigure an already deployed product. You will have an impact on how we shape our ETL pipeline and will make sure that deployments of new pipeline builds are automatic, predictable, and transparent. All of this works toward eliminating data downtime and building trust in the data that your fellow Dynatracers, at all levels of seniority, will use to build the product our customers love.
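To give a purely illustrative flavor of that kind of API-driven automation, here is a minimal Python sketch of reconfiguring an already deployed pipeline over a REST API. The endpoint, environment variable names, and payload are hypothetical placeholders, not an actual Dynatrace API:

```python
import os

import requests

# Hypothetical placeholders: a real setup would target the platform's actual
# configuration API and inject credentials from the CI/CD system.
BASE_URL = os.environ["PLATFORM_API_URL"]
TOKEN = os.environ["PLATFORM_API_TOKEN"]


def update_pipeline_schedule(pipeline_id: str, cron_schedule: str) -> None:
    """Push a new schedule to a deployed ETL pipeline, failing loudly on errors."""
    resp = requests.put(
        f"{BASE_URL}/pipelines/{pipeline_id}/config",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"schedule": cron_schedule},
        timeout=30,
    )
    # Predictable, transparent deployments: never swallow a failed reconfiguration.
    resp.raise_for_status()


if __name__ == "__main__":
    update_pipeline_schedule("daily-usage-aggregation", "0 2 * * *")
```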
This is an exciting opportunity to make a direct, tangible impact on our product and work on our crucial Digital Business Platform.
As a member of the Data Engineering team, you will be at the center of Dynatrace product innovation.
In a company as organizationally agile as Dynatrace, there is always the option, and the encouragement, to explore new areas you find interesting, move into new positions, and build a career with Dynatrace.
We guarantee plenty of challenges and scope to grow.
Qualifications
Main responsibilities
Creating deployment integrations for cloud platforms, primarily AWS and Azure (see the illustrative sketch after this list)
Deployment automation with Jenkins and Terraform
Designing and automating processes for ETL data pipeline(s)
Proactively ensuring the continuous and smooth execution of data-related processes
Collaboration in international cross-lab teams (mostly in the same time zone, across Europe) on the delivery of current objectives.
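For a hedged illustration of what deployment automation on AWS can look like in this context, the minimal Python/boto3 sketch below waits for a CloudFormation stack update to finish and reports its final status. The stack name and region are invented, and a real setup would run something like this as a Jenkins pipeline stage after Terraform or CloudFormation applies a change:

```python
import boto3


def wait_for_stack(stack_name: str, region: str = "eu-central-1") -> str:
    """Block until a CloudFormation stack update reaches a terminal state."""
    cfn = boto3.client("cloudformation", region_name=region)
    # The waiter polls describe_stacks and raises if the update fails or rolls back.
    cfn.get_waiter("stack_update_complete").wait(StackName=stack_name)
    return cfn.describe_stacks(StackName=stack_name)["Stacks"][0]["StackStatus"]


if __name__ == "__main__":
    # "data-platform-etl" is a made-up stack name used only for illustration.
    print(wait_for_stack("data-platform-etl"))
```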
Priority skills & experience
3+ years of professional experience with process automation, preferably as a DevOps engineer, SRE, or sysadmin
3+ years of working with cloud solutions, preferably AWS, covering configuration, deployment management, and automation
Experience with deployment automation and CI/CD pipelines, preferably using Jenkins
Good English communication skills.
Desired skills & experience
Experience with Cloud databases, preferably Snowflake
Experience with DB services administration (PostgreSQL, AWS RDS, Aurora, Snowflake) and practical knowledge of SQL
Practical knowledge of IaC tools such as CloudFormation and Terraform.
A mindset focused on monitoring and observability (illustrated in the sketch below)
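As a sketch of what a monitoring-and-observability mindset means for a data platform, the snippet below runs a simple data-freshness check against Snowflake using the snowflake-connector-python package. The table name, schema, and two-hour threshold are invented for illustration:

```python
import os

import snowflake.connector

# Connection details come from the environment; the variable names are
# placeholders for whatever secret store a real deployment would use.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
try:
    cur = conn.cursor()
    # Hypothetical table: fail the check if the last load is older than 2 hours.
    cur.execute(
        "SELECT TIMESTAMPDIFF(hour, MAX(loaded_at), CURRENT_TIMESTAMP()) "
        "FROM analytics.raw.events"
    )
    lag_hours = cur.fetchone()[0]
    if lag_hours is None or lag_hours > 2:
        raise RuntimeError(f"Data is stale: last load was {lag_hours} hours ago")
finally:
    conn.close()
```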
Nice-to-haves
Experience in CI/CD support for MS Power BI development
Experience with automating data pipelines (ETL/ELT) and supporting Data Engineering and Data Science teams
Experience with multiple cloud platforms (AWS, GCP, Azure)
Good command of scripting languages such as Python, shell script, or PowerShell.
Practical knowledge of IaC and configuration management tools: Ansible, Chef, Puppet, PowerShell DSC, SaltStack, CloudFormation, Terraform, and similar.
Java literacy and experience with other programming languages
Familiarity with Docker and Kubernetes.
Additional Information
What's in it for you?
- A one-product software company creating real value for the largest enterprises and millions of end customers globally, striving for a world where software works perfectly.
- Working with the latest technologies at the forefront of innovation in tech at scale, but also in other areas like marketing, design, and research.
- Working models that offer you the flexibility you need.
- A team that thinks outside the box, welcomes unconventional ideas, and pushes boundaries.
- An environment that fosters innovation, enables creative collaboration, and allows you to grow.
- A globally unique and tailor-made career development program recognizing your potential, promoting your strengths, and supporting you in achieving your career goals.
- A truly international mindset that is being shaped by the diverse personalities, expertise, and backgrounds of our global team.
- A relocation team that is eager to help you start your journey to a new country, always there to support you and stay by your side.
- Attractive compensation packages and stock purchase options with numerous benefits and advantages.
Dynatracers come from different countries and cultures all over the world, speaking various languages. English is the one that connects us (55+ nationalities). If you need to relocate for a position you are applying for, we offer you a relocation allowance and support with your visa, work permit, accommodation, language courses, as well as a dedicated buddy program.
Compensation and rewards
Due to legal reasons, we are obliged to disclose the minimum salary for this position, which is € 56,000 gross per year based on full-time employment. We offer a higher salary in line with qualifications and experience.