Senior Analytics Engineer
Hyderabad, India
IQ-EQ
Company Description
About IQ-EQ:
IQ-EQ is a leading investor services group that brings together that rare combination of global expertise and a deep understanding of the needs of clients. We have the know-how and the know you that allows us to provide a comprehensive range of compliance, administration, asset and advisory services to investment funds, global companies, family offices and private clients globally.
IQ-EQ employs a global workforce of 5,000+ people located in 23 jurisdictions and has assets under administration (AUA) exceeding US$500 billion. IQ-EQ works with eight of the top 10 global private equity firms.
This is an exciting time to be part of IQ-EQ: we are growing substantially and are currently seeking talented data professionals to come along for the journey in the Data, Analytics and Reporting Department. You will have the opportunity to utilise IQ-EQ's leading-edge technology stack whilst enjoying our continuous learning and development programme.
Job Description
What does the Senior Analytics Engineer opportunity look like for you?
You will play a pivotal role in the development and maintenance of interactive client, investor and operational team facing dashboards using Tableau, Power BI and other visual analytics tools. You will work closely with clients and senior stakeholders to capture their BI requirements, conduct data analysis using SQL, Python (or other open-source languages), and visualise the insights in an impactful manner.
In order to be successful in this role, we require the following experience:
- Ability to apply advanced working knowledge of conceptual and physical BI implementations, demonstrating expertise within the area of Analytics and BI
- Adept at interacting directly with external clients verbally, with the interpersonal savvy to pick up on the nuances of each client conversation, and able to use those interactions to conceptualise the overall end-to-end database architecture and data model, from the source of the data to its destination
- Advanced hands-on experience of working with and querying both structured and unstructured data within data warehouses and data lakes both on-prem and in cloud (Microsoft SQL Server, Azure, AWS or GCP)
- At least 8 years of demonstrable advanced experience in SQL Server and cloud-based data stores to implement complex and robust solutions
- Advanced hands-on experience of working with SQL Server data integration tools such as SSIS, or with Azure Data Factory (at least 8 years of demonstrable experience)
- Advanced hands-on experience of ETL / ELT methods such as incremental and full load, as well as advanced experience of the various tools used to implement these methods (Azure Data Factory, Azure Databricks, SSIS, Python, dbt, Airflow, Alteryx) to cater for complex data flows; a minimal sketch of the incremental-load pattern follows this list
- Advanced hands-on experience of implementing dimensional data models within analytics databases / data warehouses
- Advanced hands-on experience of working with Python / Spark packages for analytical data modelling, data analysis and data flows (pandas, NumPy, scikit-learn, etc.) as well as data visualisation (Matplotlib, Plotly, Dash, etc.) to solve complex problems
- Advanced hands-on experience of BI tools – Power BI, Tableau, etc., (at least 7 yrs. of demonstrable experience) showcasing advanced storytelling and deep insights
- Advanced experience of various JavaScript libraries for front-end development and embedding of visuals (e.g. D3, React, Node.js)
- Ability to take a broad perspective to identify solutions and work independently with guidance in only the most complex situations
- Ability to guide others in resolving complex issues
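As a concrete illustration of the incremental-load pattern referenced in the list above, below is a minimal T-SQL sketch; the staging and dimension objects (stg.Investor, dim.Investor) and their columns are hypothetical placeholders rather than any actual IQ-EQ schema.

```sql
-- Minimal sketch of an incremental (upsert) load from a staging table into a
-- dimension table. All object and column names are hypothetical placeholders.
MERGE dim.Investor AS tgt
USING stg.Investor AS src
    ON tgt.InvestorBusinessKey = src.InvestorBusinessKey
WHEN MATCHED AND (tgt.InvestorName <> src.InvestorName
               OR tgt.Domicile     <> src.Domicile) THEN
    UPDATE SET tgt.InvestorName = src.InvestorName,
               tgt.Domicile     = src.Domicile,
               tgt.UpdatedAt    = SYSUTCDATETIME()
WHEN NOT MATCHED BY TARGET THEN
    INSERT (InvestorBusinessKey, InvestorName, Domicile, UpdatedAt)
    VALUES (src.InvestorBusinessKey, src.InvestorName, src.Domicile, SYSUTCDATETIME());
```

A full load would instead truncate and reload the target; in practice either pattern would be orchestrated from a tool such as Azure Data Factory, SSIS or dbt, as noted above.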
Tasks (what the role does on a day-to-day basis)
- Engaging with external clients, vendors, project representatives and internal stakeholders from operations and client services teams to understand their analytics and dashboard requirements and translate them into technical specifications
- Creating and maintaining the high-level database and model architecture of the BI solution, recommending best practice
- Conducting in-depth exploratory data analysis and defining business-critical features
- Working with key teams within Group Technology to get appropriate access and infrastructure setup to develop advanced visualisations (BI dashboards)
- Creating and optimising SQL Server stored procedures, views and queries, including queries spanning multiple data sources, to efficiently and effectively retrieve data from databases, employing the fundamentals of dimensional data modelling (illustrated in the sketch after this list)
- Designing and creating advanced, robust, best-practice data warehouse solutions that support business analytics requests
- Creating, maintaining and documenting the different analytics solution processes created for each project worked on
- Resolving IT and data issues: when database issues arise or support requests come in through Azure DevOps or the help desk, BI developers are expected to resolve these problems, which requires a good understanding of legacy solutions and issues
- Ensuring updates to Azure DevOps items, help desk tickets and work in progress are well communicated, and that escalations regarding the support being provided are kept to a minimum
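To illustrate the stored-procedure and dimensional-modelling work described in the task list above, here is a minimal T-SQL sketch; the star-schema objects (fact.Holding, dim.Client, dim.Date), the rpt schema and the parameters are hypothetical placeholders, not the actual reporting model.

```sql
-- Hypothetical reporting procedure joining a fact table to its dimensions,
-- e.g. as the data source behind a client-facing dashboard.
-- CREATE OR ALTER requires SQL Server 2016 SP1 or later.
CREATE OR ALTER PROCEDURE rpt.GetClientHoldings
    @ClientKey INT,
    @AsOfDate  DATE
AS
BEGIN
    SET NOCOUNT ON;

    SELECT d.CalendarDate,
           c.ClientName,
           SUM(f.MarketValue) AS TotalMarketValue
    FROM fact.Holding AS f
    JOIN dim.Client   AS c ON c.ClientKey = f.ClientKey
    JOIN dim.Date     AS d ON d.DateKey   = f.DateKey
    WHERE c.ClientKey    = @ClientKey
      AND d.CalendarDate <= @AsOfDate
    GROUP BY d.CalendarDate, c.ClientName
    ORDER BY d.CalendarDate;
END;
```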
Key competencies for position and level (see Group Competency model)
- Analytical Reasoning – Ability to identify patterns within a group of facts or rules and use those patterns to determine outcomes about what could / must be true and come to logical conclusions
- Critical Thinking – Ability to conceptualise, analyse, synthesise, evaluate and apply information to reach an answer or conclusion
- Conceptual Thinking and Creative Problem Solving – An original thinker with the ability to go beyond traditional approaches, the resourcefulness to adapt to new or difficult situations, and the persistence to devise ways to overcome obstacles without giving up easily.
- Interpersonal Savvy – Relating comfortably with people across all levels, functions, cultures & geographies, building rapport in an open, friendly & accepting way.
- Effective Communication – Adjusting communication style to fit the audience & message, whilst providing timely information to help others across the organisation. Encourages the open expression of diverse ideas and opinions.
- Results / Action Orientated and Determination – Readily taking action on challenges without unnecessary planning, identifying new opportunities, and taking ownership of them with a focus on getting the problem solved.
Key behaviours we expect to see
In addition to demonstrating our Group Values (Authentic, Bold, and Collaborative), the role holder will be expected to demonstrate the following:
- Facilitate open and frank debate to drive forward improvement
- Willingness to learn, develop, and keep abreast of technological developments
- An analytical mind, excellent problem-solving and diagnostic skills, and attention to detail
Qualifications
Required Experience
Education / professional qualifications
- Degree level education in Data Analytics or Computer Science is preferred but equivalent professional IT certification is acceptable.
Background experience
- A minimum of 8 years' experience in a (senior) developer / engineer role or similar database experience.
- Experience in leading teams or projects
- Deep understanding of dimensional data modelling methodologies
- Advanced experience with visualisation and reporting tools, namely Tableau and Power BI, as well as QlikView, Looker and ThoughtSpot
- Experience with Microsoft Fabric Platform
- Experience with MS Excel including PowerPivot
Technical
- Advanced experience of developing and supporting a variety of SQL based applications.
- Hands on experience with SQL 2016 and above
- Experience with T-SQL and the ability to analyse queries for efficiency (a minimal sketch follows this list)
- Experience with MS SQL Server suite, including SSIS
- Experience in Fabric Data Factory, Azure Data Factory, Azure Synapse
- Experience in both batch (incremental and full load) and near-real-time ETL / ELT data processing
- Experience with version control software (e.g. Git, Bitbucket) as well as software development platforms such as Azure DevOps and Jira
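As a minimal sketch of the query-efficiency analysis mentioned above, the following T-SQL switches on session-level I/O and timing statistics around a hypothetical aggregate query; fact.Holding is a placeholder table, and in practice the execution plan (for example, the graphical plan in SSMS) would be reviewed alongside these figures.

```sql
-- Surface logical reads and elapsed time for the session, then run the query
-- under investigation. fact.Holding is a hypothetical placeholder.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT TOP (100)
       f.FundKey,
       SUM(f.MarketValue) AS TotalValue
FROM fact.Holding AS f
GROUP BY f.FundKey
ORDER BY TotalValue DESC;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```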
Languages
- Fully proficient in spoken and written English
Tags: Airflow Architecture AWS Azure Bitbucket Business Analytics Computer Science D3 Data analysis Data Analytics Databricks Data warehouse dbt DevOps EDA ELT ETL Excel GCP Git Java Jira Looker Matplotlib MS SQL NumPy Open Source Plotly Power BI Python QlikView React Scikit-learn Spark SQL SSIS Tableau T-SQL Unstructured data
Perks/benefits: Career development