Data Engineering Intern (Fall)
Berkeley, CA
Aircapture
At Aircapture we're creating and scaling a circular carbon economy to solve what we believe to be our lifetime's most pressing challenge: the climate crisis. We supply commercial and industrial customers with clean CO2 captured from our atmosphere to radically improve the environment, our economy, and our lives. We value building a team of people who represent diverse backgrounds--be it thought, education, gender, ethnicity, age, or sexual orientation--to reach our goals. Thank you for considering us.
As a part-time Fall Data Engineering Intern, you will play a key role in creating new data architectures and in monitoring, optimizing, and automating existing data pipelines and models. You will work alongside our data analysis, test engineering, and electrical engineering teams to improve Aircapture's data processing speed, efficiency, and cost. If you are excited to mitigate the impact of climate change at a groundbreaking climate technology startup, this is the role for you!
Pay Rate: $25 per hour
What You'll Do Here
- This role is onsite at our Berkeley headquarters for approximately 10+ hours per week; your work hours can be adjusted to fit your class schedule
- Partner with the test engineering and data analysis teams to understand Aircapture's data requirements and translate business needs into technical solutions
- Build and maintain scalable data pipelines to extract, transform, and load (ETL) operational data from Aircapture's projects and machines
- Develop and maintain dashboards, reports, and visualizations to communicate key insights and contribute to ongoing data-driven initiatives
- Serve as a point of contact for internal IT needs, working closely with external support partners when needed
- Set up and document IT scripts and services supporting core data systems, ensuring smooth operations and reliable data workflows
- Extract, transform, and load sensor and instrument data into databases to create analytics that inform decision-making (a minimal ETL sketch follows this list)
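To give a concrete flavor of this kind of work, here is a minimal ETL sketch in Python using pandas and SQLAlchemy. The file, column, and table names are hypothetical examples for illustration only, not Aircapture's actual systems.

```python
# Minimal ETL sketch (hypothetical file, column, and table names):
# extract raw sensor readings, clean and resample them with pandas,
# and load them into a SQL table for downstream analytics.
import pandas as pd
from sqlalchemy import create_engine


def run_etl(csv_path: str, db_url: str) -> None:
    # Extract: read raw instrument data exported as CSV
    raw = pd.read_csv(csv_path, parse_dates=["timestamp"])

    # Transform: drop incomplete rows and resample to 1-minute means
    cleaned = (
        raw.dropna(subset=["timestamp", "co2_ppm"])
           .set_index("timestamp")
           .resample("1min")
           .mean(numeric_only=True)
           .reset_index()
    )

    # Load: append the cleaned readings to a database table
    engine = create_engine(db_url)
    cleaned.to_sql("sensor_readings_1min", engine, if_exists="append", index=False)


if __name__ == "__main__":
    run_etl("sensor_export.csv", "sqlite:///demo.db")
```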
Your Skills and Abilities Include
- An open mind and curiosity to try new things; you're ready to share your perspective with thoughtfulness and conviction
- Currently entering senior year or equivalent in Computer Science, Data Science, or a related field, with at least one completed course, internship, or hands-on project in data engineering; or a recently completed bachelor's degree
- Proficiency in Python (pandas/NumPy) and SQL (PySpark a plus)
- Experience working in any cloud a plus (we use AWS)
- Basic understanding of data science concepts (data cleaning, feature engineering, regression, machine learning principles)
- Knowledge of Apache software (such as Spark, Airflow, and Kafka), Linux, and Docker a plus; if you don't know them, you can start learning them here! (see the small Airflow sketch after this list)
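For candidates new to Airflow, here is a small sketch of the kind of scheduled, automated pipeline it orchestrates. The DAG and task names are hypothetical, and the `schedule` argument assumes Airflow 2.4 or later; this is an illustration, not a description of Aircapture's pipelines.

```python
# Small Airflow DAG sketch (hypothetical names): run a daily ETL task
# like the one sketched above on an automated schedule.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for the actual ETL logic
    print("running daily sensor ETL")


with DAG(
    dag_id="daily_sensor_etl",
    start_date=datetime(2024, 9, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```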
Aircapture strives to create a safe, inclusive, equitable, and diverse workplace. Every teammate adds to who we are, diversifying our ideas, experiences, and viewpoints and making us stronger. We hope you feel welcome here.
Tags: Airflow Architecture AWS Computer Science Data analysis Data pipelines Docker Engineering ETL Feature engineering Industrial Kafka Linux Machine Learning NumPy Pandas Pipelines PySpark Python Spark SQL
Perks/benefits: Startup environment