Data Engineering Intern (Remote in North Carolina)
About your role
As a Data Engineering Intern, you will work with the Business Intelligence team to help extract, transform, and load (ETL) data from both internal and external sources into a Snowflake data warehouse. You'll gain hands-on experience using tools such as SQL, Airflow, and Google Cloud to orchestrate data flows and support data-driven decision-making across the organization. You'll collaborate closely with a diverse team of report developers, product owners, and QA staff to help ensure smooth and accurate data operations, and you'll gain experience working in an agile environment.
Reports to: Principal Data Engineering Architect
Location: Remote, but you must be located in CA, AZ, CO, NC, or UT during the internship
Duration: June 16, 2025 - August 18, 2025, working 40 hours per week
How you will make a difference day-to-day
- Data Integration: Analyze, extract, transform, and load data from multiple internal and external sources into a Snowflake warehouse to support data analysis and business intelligence efforts.
- SQL & API Calls: Use SQL and API calls to extract data, and then transform and load it using SQL-based processes.
- Orchestrate Data Flows: Use Airflow in Google Cloud to manage and orchestrate data flows, ensuring seamless data pipeline operations.
- Team Collaboration: Participate in daily team huddles, working alongside other engineers, report developers, and product teams to ensure alignment and progress.
- Collaboration with Report Developers: Work closely with report developers to create useful target tables and views, supporting various business intelligence reports.
- Collaboration with Product & QA Teams: Partner with product owners, release managers, and QA staff to validate and promote code to production.
- Agile Workflow: Update a JIRA Kanban board in some weeks and a sprint board in others, helping to manage and track progress in an agile environment.
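To give a flavor of the extract-transform-load work described above, here is a minimal Python sketch. The field names, table layout, and stubbed API response are illustrative assumptions, and sqlite3 stands in for the Snowflake connector; the actual pipeline would use the team's real sources and warehouse.

```python
import sqlite3

def extract():
    # Hypothetical extract step: in practice this would be an API call,
    # e.g. requests.get(API_URL).json(); stubbed here so the sketch runs.
    return [
        {"order_id": 1, "amount": "19.99", "region": "NC"},
        {"order_id": 2, "amount": "5.00", "region": "CA"},
    ]

def transform(rows):
    # Cast amounts to numbers and keep only the columns the target table needs.
    return [(r["order_id"], float(r["amount"]), r["region"]) for r in rows]

def load(rows, conn):
    # sqlite3 stands in for a Snowflake connection; the SQL has the same shape.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))
```

In a production setting each of these steps would typically be a task in an Airflow DAG, so that retries, scheduling, and dependencies between extract, transform, and load are handled by the orchestrator rather than by hand.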
What you’ll need
- Senior undergraduate student pursuing a Bachelor's degree in Computer Science or a related field.
- Proficiency in SQL, Python, Git, and Linux.
- Ability to work collaboratively in a fast-paced, team-oriented environment.
- Familiarity with Google Cloud Platform, Airflow, and Snowflake.
Interview Process:
- Recruiter Phone Screen
- Role Assessment(s)
- Hiring Manager Interview
Compensation:
- Utah, Arizona, North Carolina: $29.60
- California (Outside of San Francisco Bay area) and Colorado: $31.45
By applying for this position, you agree that your data will be processed in accordance with the Rocket Lawyer Privacy Policy.