Consultant, Data Analytics & Reporting - Ag & Trading
Bangalore, Karnataka, India, 560087
Job Purpose and Impact
- This role designs, builds and maintains moderately complex data systems that enable data analysis and reporting. With limited supervision, the incumbent collaborates across teams to ensure that large sets of data are processed efficiently and made accessible for decision making.
Key Accountabilities
- DATA & ANALYTICAL SOLUTIONS: Develops moderately complex data products and solutions using advanced data engineering and cloud-based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
- DATA PIPELINES: Maintains and supports the development of streaming and batch data pipelines that ingest data from various sources, transform it into usable information and move it to data stores such as data lakes and data warehouses (see the sketch after this list).
- DATA SYSTEMS: Reviews existing data systems and architectures to identify and implement areas for improvement and optimization.
- DATA INFRASTRUCTURE: Helps prepare data infrastructure to support the efficient storage and retrieval of data.
- DATA FORMATS: Implements appropriate data formats to improve data usability and accessibility across the organization.
- STAKEHOLDER MANAGEMENT: Partners with cross-functional data and advanced analytics teams to gather requirements and ensure that data solutions meet the functional and non-functional needs of various partners.
- DATA FRAMEWORKS: Builds moderately complex prototypes to test new concepts and implements data engineering frameworks and architectures to support the improvement of data processing capabilities and advanced analytics initiatives.
- AUTOMATED DEPLOYMENT PIPELINES: Implements automated deployment pipelines to improve the efficiency of code deployments with fit-for-purpose governance.
- DATA MODELING: Performs moderately complex data modeling aligned with the datastore technology to ensure sustainable performance and accessibility.
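For illustration only, a minimal sketch of the kind of batch ingest-transform-load step described under DATA PIPELINES, assuming pandas and SQLAlchemy are available; the file path, connection URI and table name are hypothetical placeholders, not part of this posting.

```python
# Minimal batch pipeline sketch: ingest a raw file, apply light transforms,
# and load the result into a warehouse staging table.
# All names (orders.csv, WAREHOUSE_URI, orders_stg) are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_FILE = "orders.csv"                                          # hypothetical raw extract
WAREHOUSE_URI = "postgresql://user:pass@warehouse:5432/analytics"   # hypothetical target

def run_batch_load() -> int:
    """Ingest, transform and load one batch; returns the number of rows loaded."""
    raw = pd.read_csv(SOURCE_FILE)                           # ingestion
    raw["order_date"] = pd.to_datetime(raw["order_date"])    # transform: type cast
    cleaned = raw.dropna(subset=["order_id"])                # transform: drop bad rows
    engine = create_engine(WAREHOUSE_URI)
    cleaned.to_sql("orders_stg", engine, if_exists="append", index=False)  # load
    return len(cleaned)

if __name__ == "__main__":
    print(f"Loaded {run_batch_load()} rows")
```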
Qualifications
- Minimum of 2 years of relevant work experience with Python and SQL
- Good to have: working knowledge of AWS Lambda and AWS Glue
- At least 1 year of experience with Snowflake (see the illustrative sketch after this list)
- Bachelor's or Master's degree in Computer Science or a relevant field, with a minimum of 2 years of relevant work experience
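For illustration only, a minimal sketch of the Python, AWS Lambda and Snowflake stack named above: an event-driven Lambda handler that copies a newly staged S3 file into a Snowflake table via snowflake-connector-python. The stage, table, environment variables and event shape are hypothetical placeholders.

```python
# Minimal sketch of an AWS Lambda handler that copies a staged file into
# Snowflake; assumes snowflake-connector-python is bundled with the function.
# All identifiers (S3_STAGE, ORDERS_RAW, env var names) are hypothetical.
import os
import snowflake.connector

def lambda_handler(event, context):
    # The S3 object key is taken from the triggering event (hypothetical shape).
    key = event["Records"][0]["s3"]["object"]["key"]

    conn = snowflake.connector.connect(
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        warehouse="ANALYTICS_WH",
        database="RAW",
        schema="TRADING",
    )
    try:
        # COPY INTO loads the staged file into a raw landing table.
        conn.cursor().execute(
            f"COPY INTO ORDERS_RAW FROM @S3_STAGE/{key} FILE_FORMAT = (TYPE = CSV)"
        )
    finally:
        conn.close()
    return {"status": "ok", "key": key}
```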