Senior BI Developer
Remote - Colombia
Ansira
Fuel growth across your distributed networks with an industry-defining platform designed to synchronize your partner ecosystem.
The Senior BI Developer is a self-starter with a strong desire to learn and work with cloud-native technologies and processes, improving efficiency along the way and making an impact while contributing to cross-functional teams. You strongly believe in using data to achieve our company's and clients' goals. You own the design, development, and maintenance of all aspects of Ansira's Analytics and BI solutions, including data ingestion, data quality, visualization, documentation, and support. You will collaborate with stakeholders and clients to perform deep-dive analyses of key business trends from different perspectives and package the insights into easily consumable presentations, dashboards, and reports.
You are expected to contribute more than just code. You'll be involved in defining how things work, what they do, and why we do that instead of something else. We also expect you to share your knowledge and expertise with everyone else. Your ability to collaborate creatively and execute on team goals will affect scalability and directly contribute to the company's product and the features our team builds. You will collaborate with product, engineering, and other development teams at Ansira to build cloud-native solutions using modern data technologies in a dynamic and agile environment.
You will be part of a fun, diverse team that seeks challenges, loves learning, and values teamwork. You will have opportunities for learning, mentorship, and career growth, and will work on high-business-impact areas.
Responsibilities:
- Contribute to the full development life cycle of features and products in our SaaS Platform aiming to meet or exceed customer SLAs.
- Participate in the design, development and implementation of large-scale distributed systems using cloud-native principles and technologies.
- Participate in the design, development and implementation of applications and services able to process large volumes of data, focusing on security, scalability, latency, and resiliency.
- Analyze business requirements and translate them into Data Warehouse/Business Intelligence and reporting solutions.
- Consult on BI capabilities and recommend and develop solutions to address business needs and performance requirements.
- Extend the Data Warehouse with data from new sources, applying data processing standards and best practices.
- Design and build BI dashboards in short time frames via rapid prototyping and agile development.
- Perform rigorous data analysis to proactively identify any inconsistencies or data quality issues. Provide recommendations for improvements.
- Develop a strong understanding of different data sources, and strategically implement data flows for robustness and scalability.
- Identify development needs to improve and streamline operations.
- Identify cross-domain opportunities arising from the data.
- Write scalable, performant, readable and tested code following standards and best coding practices.
- Develop test strategies, use automation frameworks, write unit/functional tests to drive up code coverage and automation metrics.
- Participate in code reviews and provide meaningful feedback that helps other developers to build better solutions.
- Present your own designs to other development teams, engineering or stakeholders and review designs of others.
- Contribute relevant, clean, concise and quality documentation to Ansira's knowledge base to support/increase information sharing within the organization.
- Learn about Ansira’s business, master our development process, culture and code base, then improve it.
- Establish strong working relationships at all organizational levels and across functional teams.
- Collaborate with internal/external stakeholders and the product team to gather functional, non-functional, and business requirements.
- Work closely with product owners and a wide variety of stakeholders to analyze and break down large requirements into small, simple, workable deliverables.
- Work in a fast-paced environment and deliver incremental value iteratively and continuously.
- Take responsibility and ownership of product timelines and deliverables.
Qualifications:
- Bachelor's or Master's degree in computer science, computer engineering, statistics, math, a related field, or equivalent experience.
- 5+ years of hands-on experience analyzing large, multi-dimensional data sets and synthesizing insights into actionable solutions.
- 5+ years of hands-on experience designing, developing, and implementing enterprise DW/BI systems.
- 5+ years of hands-on experience developing and running ETL or ELT processes.
- 5+ years of hands-on experience analyzing and optimizing queries and production workloads.
- Expertise in SQL/PL-SQL, data manipulation, query development, and optimization.
- Expertise in DWH/Data Lake design, modeling and development.
- Expertise developing high-concurrency, high-volume OLAP systems.
- Expertise using ETL or ELT to ingest data from multiple sources.
- Expertise consuming web-services (REST, SOAP).
- Expertise troubleshooting and resolving performance issues at the database and application levels.
- Expertise using and creating Data Model Diagrams and Data dictionaries.
- Proficiency in caching, data replication, and data partitioning.
- Proficiency running workloads in containers (Docker) or Kubernetes.
- Proficiency in development of web-based and web-enabled business applications.
- Proficiency in analyzing production workloads and developing strategies to run data systems with scale and efficiency.
- Proficiency in OWASP security principles, understanding accessibility, and security compliance.
- Proficiency in data security and data protection strategies.
- Proficiency in UML or C4 models.
- Proficiency in Unix and command line tools.
- Proficiency designing, building and deploying scalable, highly available analytics or BI solutions.
- Experience with Test Driven Development (TDD) or experience with automated testing including unit, functional, stress and load testing.
- Experience with Continuous Integration, Continuous Delivery and DevSecOps best practices.
- Experience with the entire Software Development Life Cycle (SDLC) and Agile development methodologies such as Scrum or Extreme Programming.
- A passion for solving problems and providing workable solutions while demonstrating the flexibility to learn new technologies that meet business needs.
- Strong communication skills (English) as well as experience in mentoring and educating your peers.
Preferred Knowledge/Skills :
- Expertise in one or more RDBMS such as PostgreSQL, MySQL, Oracle, SQL Server, etc. Emphasis on PostgreSQL.
- Expertise developing queries and stored procedures in SQL, PL-SQL, or T-SQL.
- Expertise in one or more DWH platforms such as BigQuery, Snowflake, Redshift, Cloudera, Azure Data Lake Store, etc. Emphasis on BigQuery.
- Expertise in data visualization techniques using tools such as PLX Dashboards, Google Data Studio, Looker, Tableau, or similar technologies. Emphasis on Looker.
- Proficiency in one or more programming languages such as Java, PHP, Python, Go, etc. Emphasis on Java and Python.
- Proficiency in one or more ETL/ELT tools such as Spring Cloud Data Flow, Google Dataflow, Apache Beam, Apache Airflow, etc. Emphasis on Spring Cloud Data Flow.
- Competency in one or more Version Control Systems such as Git, SVN, CVS, Team Foundation. Emphasis on Git.
- Competency in one or more observability tools such as Apache SkyWalking, Prometheus, Grafana, Graylog, and Stackdriver.
- Competency in one or more public cloud providers (AWS, Azure, GCP, etc). Emphasis on Google Cloud Platform.
- Fluency in one or more distributed or NoSQL databases such as CockroachDB, MongoDB, Cassandra, Couchbase, DynamoDB, Redis, etc.
- Fluency in cloud object storage such as S3 or GCS. Emphasis on GCS.
- Fluency in HTML and JavaScript.
It’s a plus if you have any of the following skills:
- Experience in statistical modeling using tools such as R, SciPy, or SAS.
- Experience in one or more large-scale streaming frameworks such as Apache Spark, Apache Storm, Apache Flink, Hadoop, etc.
- Experience with Machine Learning.
Perks/benefits: Career development, startup environment