Data Architect/Engineer Python Developer

Kyiv, Kyiv City

Ajax Systems

The Ajax alarm system includes everything you need to protect your property | A professional home security system that combines CCTV, fire alarm, and automation



Ajax Systems is a full-cycle company, working from idea generation and R&D through to mass production and sales. We do everything ourselves: we produce the physical devices (the system includes many different sensors and hubs), write firmware for them, develop the server side, and release the mobile applications. The whole team is in one office in Kyiv, and all technical and product decisions are made locally. We’re looking for a Data Engineer to join us and continue the evolution of a product that we love: someone who takes pride in their work and ensures that user experience and development quality are superb.
Required skills:
- Proven experience in a Data Architect or Architect Data Engineer role
- At least 3 years of experience as a Python Developer
- Strong problem-solving, troubleshooting, and analysis skills
- Several years of experience and a substantial understanding of:
  - Data ingestion frameworks for real-time and batch processing (a brief batch-ingestion sketch follows this list)
  - Development and optimization of relational databases such as MySQL or PostgreSQL
  - NoSQL databases and search systems (including Elasticsearch, Kibana, and MongoDB)
  - Cloud-based object storage systems (e.g. S3-compatible services)
  - Data access and warehousing tools for analytical querying (e.g. distributed query engines, cloud data warehouses)
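As a rough, hypothetical illustration of the batch side of the ingestion requirement above (not Ajax Systems' actual stack), the sketch below loads a CSV export from an S3-compatible bucket into PostgreSQL with Python; the endpoint, bucket, key, table, and connection details are placeholders.

# Minimal batch-ingestion sketch: S3-compatible object storage -> PostgreSQL.
# All names (endpoint, bucket, key, table, DSN) are hypothetical placeholders.
import csv
import io

import boto3
import psycopg2

s3 = boto3.client("s3", endpoint_url="https://storage.example.com")  # placeholder endpoint
obj = s3.get_object(Bucket="exports", Key="daily/events.csv")        # placeholder bucket/key
rows = csv.reader(io.StringIO(obj["Body"].read().decode("utf-8")))
next(rows)  # skip the header row

conn = psycopg2.connect("dbname=analytics user=etl host=localhost")  # placeholder DSN
with conn, conn.cursor() as cur:  # the connection context manager commits on success
    cur.executemany(
        "INSERT INTO raw_events (event_id, device_id, occurred_at) VALUES (%s, %s, %s)",
        rows,
    )
conn.close()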
Will be a plus:
- Experience working with large volumes of data and databases
- Knowledge of version control tools such as Git
- English at the level of reading and understanding technical documentation
- Ability to create complex SQL queries against data warehouses and application databases (see the query sketch below)
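The complex analytical SQL mentioned above could look like the following sketch, which runs a windowed aggregation against a warehouse-style table from Python; the table, columns, and connection string are invented for illustration.

# Illustrative analytical query with a window function, executed from Python.
# device_events and its columns are hypothetical; the DSN is a placeholder.
import psycopg2

QUERY = """
SELECT device_id,
       date_trunc('day', occurred_at) AS day,
       count(*) AS events,
       avg(count(*)) OVER (
           PARTITION BY device_id
           ORDER BY date_trunc('day', occurred_at)
           ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
       ) AS events_7d_avg
FROM device_events
GROUP BY device_id, date_trunc('day', occurred_at)
ORDER BY device_id, day;
"""

conn = psycopg2.connect("dbname=warehouse user=analyst host=localhost")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for device_id, day, events, events_7d_avg in cur.fetchall():
        print(device_id, day, events, round(float(events_7d_avg), 1))
conn.close()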
Tasks and responsibilities:
- Develop and manage large-scale data systems, ingestion capabilities, and infrastructure
- Support the design and development of solutions for delivering dashboards and reports to various stakeholders
- Architect data pipelines and ETL processes that connect to various data sources
- Design and maintain enterprise data warehouse models
- Manage the cloud-based data and analytics platform
- Deploy updates and fixes
- Evaluate large and complex data sets
- Ensure queries are efficient and use as few resources as possible (see the query-plan sketch after this list)
- Troubleshoot queries to address critical production issues
- Assist other team members with refining complex queries and performance tuning
- Understand and analyze requirements to develop, test, and deploy complex SQL queries used to extract business data for regulatory and other purposes
- Write and maintain technical documentation
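As a hedged example of the query-efficiency and troubleshooting duties above, the snippet below pulls an execution plan with PostgreSQL's EXPLAIN (ANALYZE, BUFFERS) so a slow query can be inspected before tuning; the query, table, and connection details are placeholders rather than a real production workload.

# Sketch of troubleshooting a slow query: print the PostgreSQL execution plan.
# The query, table, and DSN are hypothetical placeholders.
import psycopg2

SLOW_QUERY = """
SELECT device_id, count(*)
FROM device_events
WHERE occurred_at >= now() - interval '30 days'
GROUP BY device_id
"""

conn = psycopg2.connect("dbname=analytics user=analyst host=localhost")  # placeholder DSN
with conn, conn.cursor() as cur:
    # EXPLAIN ANALYZE actually runs the statement and reports real timings.
    cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + SLOW_QUERY)
    for (line,) in cur.fetchall():
        print(line)  # look for sequential scans, row misestimates, spilled sorts, etc.
conn.close()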



Tags: Data pipelines Data warehouse Elasticsearch ETL Kibana MongoDB MySQL NoSQL Pipelines PostgreSQL Python R R&D RDBMS SQL

Region: Europe
Country: Ukraine
