Analytics Engineer

Hyderabad, Telangana, India

IQ-EQ


Company Description

  • IQ-EQ is a preeminent service provider to the alternative asset industry. IQ-EQ works with managers in multiple capacities, ranging from hedge fund, private equity fund and mutual fund launches; private equity fund administration; advisory firm set-up, regulatory registration and infrastructure design; ongoing regulatory compliance (SEC, CFTC and 40 Act); financial controls and operational support services; compliance and operational projects and reviews; to outsourced CFO/controller and administration services for private equity fund investments, including portfolio companies, real estate assets and energy assets. Our client base is growing, and our existing clients are engaging the firm across the full spectrum of our service offerings.

Job Description

About IQ-EQ:

  • IQ-EQ is a leading investor services group that brings together that rare combination of global expertise and a deep understanding of the needs of clients. We have the know-how and the know you that allows us to provide a comprehensive range of compliance, administration, asset and advisory services to investment funds, global companies, family offices and private clients globally.
  • IQ-EQ employs a global workforce of 5,000+ people located in 23 jurisdictions and has assets under administration (AUA) exceeding US$500 billion. IQ-EQ works with eight of the top 10 global private equity firms.
  • This is an exciting time to be part of IQ-EQ: we are growing substantially and are currently seeking talented data professionals to join the Data, Analytics and Reporting Department. You will have the opportunity to utilise IQ-EQ's leading-edge technology stack whilst enjoying our continuous learning and development programme.

What does the Analytics Engineer opportunity look like for you?

  • You will play a pivotal role in the development and maintenance of interactive client-, investor- and operations-facing dashboards using Tableau, Power BI and other visual analytics tools. You will work closely with clients and senior stakeholders to capture their BI requirements, conduct data analysis using SQL, Python (or other open-source languages), and visualise the insights in an impactful manner.

To be successful in this role, we require the following experience:

  • Experience of interacting directly with external clients verbally and, through those interactions, the ability to conceptualise what the overall end-to-end database architecture and data model would look like, from the source to the destination of the data
  • Intermediate experience of working with structured and unstructured data, data warehouses and data lakes both on-prem and in cloud (Microsoft SQL Server, Azure, AWS or GCP)
  • At least 5 years of demonstrable experience with SQL Server and cloud-based data stores
  • Intermediate knowledge of SQL Server data tools such as SSIS or Azure Data Factory (at least 5 years of demonstrable experience)
  • Intermediate experience of ETL / ELT methods such as incremental and full load, as well as various tools to implement them, such as Azure Data Factory, Azure Databricks, SSIS, Python, dbt, Airflow and Alteryx
  • Intermediate experience of implementing dimensional data models within analytics databases / data warehouses
  • Intermediate knowledge of Python / Spark packages for analytical data modelling and data analysis (pandas, NumPy, scikit-learn, etc.) as well as data visualisation (Matplotlib, Plotly, Dash, etc.)
  • Intermediate experience of BI tools – Power BI, Tableau (at least 4 yrs. of demonstrable experience)
  • Experience of various JavaScript libraries for front-end development and embedding of visuals (e.g. D3, React, Node)
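For illustration only (this sketch is not part of the role description): the distinction between incremental and full load mentioned above can be shown with a minimal Python example, using sqlite3 as a stand-in for a real source and target; the table name `orders` and the watermark column `last_modified` are hypothetical.

```python
import sqlite3

# Hypothetical source and target databases (in-memory for the sketch).
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")

src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, last_modified TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-01-02")])
tgt.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, last_modified TEXT)")

def full_load():
    """Full load: truncate the target and copy every source row."""
    tgt.execute("DELETE FROM orders")
    rows = src.execute("SELECT id, amount, last_modified FROM orders").fetchall()
    tgt.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

def incremental_load():
    """Incremental load: copy only rows newer than the target's high-water mark."""
    (watermark,) = tgt.execute(
        "SELECT COALESCE(MAX(last_modified), '') FROM orders").fetchone()
    rows = src.execute(
        "SELECT id, amount, last_modified FROM orders WHERE last_modified > ?",
        (watermark,)).fetchall()
    tgt.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)

full_load()                                           # initial load: 2 rows copied
src.execute("INSERT INTO orders VALUES (3, 30.0, '2024-01-03')")
incremental_load()                                    # picks up only the new row
print(tgt.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 3
```

Tools such as Azure Data Factory or SSIS implement the same watermark pattern at scale; the trade-off is that a full load is simple and self-healing, while an incremental load is cheaper but depends on a reliable change-tracking column.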

Tasks (what the role does on a day-to-day basis)

  • Engage with external clients, vendors, project reps, internal stakeholders from operations as well as client services teams to understand their analytics and dashboard requirements
  • Maintain and enhance existing database architecture of the BI solution
  • Conduct in-depth exploratory data analysis and define business critical features
  • Work with key teams within Group Technology to get appropriate access and infrastructure setup to develop advanced visualisations (BI dashboards)
  • Optimise SQL Server queries, stored procedures and views to efficiently and effectively retrieve data from databases, employing the fundamentals of dimensional data modelling
  • Ensure that ticket updates and work in progress are well communicated, and that escalations regarding delivery are kept to a minimum
  • Maintain and enhance existing data solutions. Maintain best-practice data warehouse solutions that support business analytics needs. This is achieved using on-prem or cloud databases and different ETL / ELT programs and software, such as Azure Data Factory, SSIS, Python, PowerShell, Alteryx or other open source technology
  • Create, maintain and document the different analytics solution processes created per project worked on
  • Resolve IT and data issues. When database issues arise or development requests come in through the help desk, the role holder works to resolve these problems. This requires an understanding of legacy solutions and their issues.
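As a toy illustration of the dimensional data modelling mentioned above (again, not part of the role description): a star schema pairs a fact table of measures with dimension tables of attributes. All names here (`dim_client`, `fact_fee`) are hypothetical, and sqlite3 stands in for SQL Server.

```python
import sqlite3

# Toy star schema: one fact table keyed to one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_client (
    client_key  INTEGER PRIMARY KEY,
    client_name TEXT,
    region      TEXT
);
CREATE TABLE fact_fee (
    client_key INTEGER REFERENCES dim_client(client_key),
    fee_date   TEXT,
    fee_amount REAL
);
INSERT INTO dim_client VALUES (1, 'Fund A', 'EMEA'), (2, 'Fund B', 'APAC');
INSERT INTO fact_fee VALUES (1, '2024-01-31', 100.0),
                            (1, '2024-02-29', 110.0),
                            (2, '2024-01-31', 200.0);
""")

# A typical BI query: aggregate the fact table, sliced by a dimension attribute.
rows = con.execute("""
    SELECT d.region, SUM(f.fee_amount) AS total_fees
    FROM fact_fee f
    JOIN dim_client d USING (client_key)
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('APAC', 200.0), ('EMEA', 210.0)]
```

Dashboards in Power BI or Tableau typically sit on exactly this shape: measures aggregated from the fact table, filtered and grouped by dimension attributes.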

Key competencies for position and level

  • Analytical Reasoning – Ability to identify patterns within a group of facts or rules and use those patterns to determine outcomes about what could / must be true and come to logical conclusions
  • Critical Thinking – Ability to conceptualise, analyse, synthesise, evaluate and apply information to reach an answer or conclusion
  • Conceptual Thinking and Creative Problem Solving – An original thinker with the ability to go beyond traditional approaches, the resourcefulness to adapt to new or difficult situations, and the persistence to devise ways to overcome obstacles without giving up easily
  • Interpersonal Savvy – Relating comfortably with people across all levels, functions, cultures & geographies, building rapport in an open, friendly & accepting way
  • Effective Communication – Adjusting communication style to fit the audience & message, whilst providing timely information to help others across the organisation.  Encourages the open expression of diverse ideas and opinions
  • Results / Action Orientated and Determination – Readily takes action on challenges without unnecessary planning, identifies new opportunities and takes ownership of them, with a focus on getting the problem solved

Key behaviours we expect to see

In addition to demonstrating our Group Values (Authentic, Bold, and Collaborative), the role holder will be expected to demonstrate the following:

  • Facilitate open and frank debate to drive forward improvement
  • Willingness to learn, develop, and keep abreast of technological developments
  • An analytical mind, excellent problem-solving & diagnostic skills, attention to detail

Qualifications

Required Experience

Education / professional qualifications

  • Degree level education in Data Analytics or Computer Science is preferred but equivalent professional IT certification is acceptable.

Background experience

  • A minimum of 5 years' experience in a developer / engineer role or similar database experience
  • Good understanding of dimensional data modelling methodologies
  • Experience with visualisation and reporting tools, primarily Tableau and Power BI, as well as QlikView, Looker and ThoughtSpot
  • Experience with Microsoft Fabric Platform
  • Experience with MS Excel including PowerPivot

Technical

  • Experience of supporting a variety of SQL based applications.
  • Hands on experience with SQL 2016 and above
  • Experience with T-SQL and the ability to analyse queries for efficiency.
  • Experience with MS SQL Server suite, including SSIS
  • Experience in Fabric Data Factory, Azure Data Factory, Azure Synapse
  • Experience in both batch (incremental and full load) and near-real time data ETL / ELT data processing
  • Experience with version control software (e.g. Git, Bitbucket) as well as software development platforms such as Azure DevOps and Jira

Languages

  • English

 
