Python Engineer 2
Poznan, Poland
Full Time · Senior-level / Expert · PLN 14K - 19K
Job Description
The salary range for this position is PLN 14 200 - 19 700 (contract of employment).
We offer a hybrid work model, arranged together with your leader and the team.
We are looking for people who:
- know Python or want to learn it (Python is easy!) and know another language well;
- have solid knowledge of GCP services (Compute Engine, Cloud SQL), Azure, and data analysis tools such as BigQuery;
- have experience in cost management in a GCP and/or Azure environment;
- are not afraid of tasks that sit between DevOps and programming (such as CI/CD configuration, k8s cluster problem analysis, and configuration of support tools such as Kibana, Graphite or Logstash);
- pay attention to code quality and follow best programming practices;
- know English at a minimum B2 level.
The following are also a plus:
- knowledge of TDD methodology;
- knowledge of Django;
- experience in data analysis, financial modeling, budgeting and/or forecasting;
- understanding of billing on cloud platforms like GCP (incl. impact of usage patterns on TCO).
In your daily work you will handle the following tasks:
- you will work with a team of FinOps professionals who use business data and statistical methods to provide insight into operational business performance;
- you will support the implementation and administration of FinOps tooling that provides cost transparency and optimization for cloud and on-premises infrastructure;
- you will automate cloud and on-premises cost optimization processes through the implementation of rules, forecasts and recommendations;
- you will create data pipelines that transform raw cost data into a structured format suitable for analysis, and implement data validation and cleansing processes to ensure data accuracy and reliability (an illustrative sketch follows this list);
- you will be able to gain knowledge in FinOps Engineering: development, operations, resource optimization, cost allocation and governance;
- you will gain skills in software development based on TDD, message queues, pub/sub, microservices and the many design patterns that underpin the clean code philosophy.
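To give a concrete flavour of the pipeline work described above, here is a minimal, illustrative sketch in Python: raw cost rows are typed, validated and cleansed with pandas before aggregation. The column names, values and thresholds are invented for this example and do not describe our actual tooling.

# Illustrative only: a tiny cost-data pipeline with validation and cleansing.
# Column names and example values are hypothetical.
import pandas as pd

# Raw cost rows as they might arrive from a billing export (structure assumed).
raw = pd.DataFrame([
    {"project": "shop-api", "service": "Compute Engine", "cost": "123.45", "currency": "PLN"},
    {"project": "shop-api", "service": "BigQuery",       "cost": "-1.00",  "currency": "PLN"},
    {"project": None,       "service": "Cloud SQL",      "cost": "58.10",  "currency": "PLN"},
])

# Transform: enforce numeric types so downstream analysis gets a structured format.
costs = raw.assign(cost=pd.to_numeric(raw["cost"], errors="coerce"))

# Validate: keep only rows with a project and a parseable, non-negative cost.
valid = costs["project"].notna() & costs["cost"].ge(0)
clean = costs[valid]
rejected = costs[~valid]  # set aside for inspection instead of silently dropping

# Aggregate per project and service - the kind of view cost-transparency tooling exposes.
summary = clean.groupby(["project", "service"], as_index=False)["cost"].sum()
print(summary)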
What we offer
- A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
- Annual bonus up to 10% of the annual salary gross (depending on your annual assessment and the company's results)
- A wide selection of fringe benefits in a cafeteria plan – you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers)
- English classes related to the specific nature of your job, paid for by the company
- 16" or 14" MacBook Pro with Apple Silicon processor and 36GB RAM, or a corresponding Dell with Windows (if you don’t like Macs) and other gadgets that you may need
- Working in a team you can always count on — we have on board top-class specialists and experts in their areas of expertise
- A high degree of autonomy in terms of organizing your team’s work; we encourage you to develop continuously and try out new things
- Hackathons, team tourism, a training budget and an internal educational platform, MindUp (including training courses on work organization, means of communication, motivation to work, and various technologies and subject-matter issues)
Why is it worth working with us
- We use, among others, Java 17, Kotlin, Coroutines, Scala, Spring, Reactive Programming, Spark, Dataproc and a microservices architecture supporting high request rates on our business data bus. We utilize extensive Big Data resources on GCP and incorporate Machine Learning into our operational workflows.
- The IT team is made up of over 1700 members who have shared their knowledge at multiple conferences, such as DevDays or Devoxx, and co-create a blog: allegro.tech
- Microservices – a few thousand microservices and 1.8m+ rps on our business data bus
- Big Data – several petabytes of data and Machine Learning used in production
- We practice Code Review, Continuous Integration, Scrum/Kanban, Domain Driven Design, Test Driven Development, Pair Programming, depending on the team
- Our internal ecosystem is based on self-service and widely used tools, such as Kubernetes, Docker, Consul, GitHub and GitHub Actions. From day one, this will allow you to develop software in any language, with any architecture and at any scale, restricted only by your creativity and imagination.
- To match the scale, we also focus on building entire Platforms of tools and technologies that accelerate and facilitate day-to-day development, and we ensure the best Developer Experience for our teams
- Technological autonomy: you get to choose which technology solves the problem at hand (no need for management’s consent). You are responsible for what you create
- Our deployment environment combines private Data Centers (tens of thousands of servers) and Public Clouds (Google Cloud and Microsoft Azure)
- Over 100 original open source projects and a few thousand stars on GitHub
- We organize Allegro Tech Live event, a 100% remote version of our offline Allegro Tech Talks meetups, and we make guest appearances at the invitation of such communities as Warsaw AI, JUG (Poznań, Łódź, Lublin, Wrocław), WG .Net, Dare IT, Women in Tech Summit
- We focus on development as well. We organize hackathons and internal conferences (e.g. the annual Allegro Tech Meeting), our employees regularly participate in events both in Poland and abroad (Europe and USA), and each team has its own budget for training and study aids. If you want to keep growing and share your knowledge, we will always support you
This may also be of interest to you:
Allegro Tech Podcast → https://podcast.allegro.tech/
Booklet → https://jobs.allegro.eu/pl/obszary-prac/tech-data/
Send in your CV and see why it is #goodtobehere!
Tags: Architecture Azure Big Data BigQuery CI/CD Data analysis Data pipelines Dataproc DevOps Django Docker Engineering GCP GitHub Google Cloud Graphite Java Kanban Kibana Kubernetes Logstash Machine Learning Microservices Open Source Pipelines Python Scala Scrum Spark SQL Statistics TDD
Perks/benefits: Career development Conferences Gear Health care Lunch / meals Salary bonus Team events