The Infrastructure and Automation Engineer Intern will support the Property Claims Optimization & Operational Design team as we modernize data platforms and automation across Microsoft Azure, Microsoft Fabric, and high-performance on-prem Linux environments.
This internship is designed for students and recent graduates in computer science, engineering, data, cloud, or related technical fields who are eager to gain hands-on experience with real, production-level systems. The intern will learn how modern cloud infrastructure, automation, and data pipelines are designed, deployed, and supported in an enterprise environment.
No prior professional experience is required — curiosity, willingness to learn, and foundational technical knowledge are key.
What You’ll Learn & Work On
Cloud & Platform Fundamentals (Azure + Microsoft Fabric)
- Learn how Azure cloud resources are designed and managed, including:
  - Azure Data Lake Storage (ADLS Gen2)
  - Azure Event Hubs
  - Azure Function Apps
  - Azure SQL Managed Instance
- Gain exposure to Infrastructure as Code concepts using Terraform and GitHub-based workflows.
- Assist with Microsoft Fabric workloads, including:
  - Lakehouse ingestion
  - Fabric pipelines
  - OneLake shortcuts
  - Introductory real-time data streaming patterns
- Observe and support secure cloud-to-cloud vendor integrations using modern authentication and automation techniques.
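To give a flavor of the automation involved, here is a minimal sketch of one such task: uploading a file to Azure Data Lake Storage using token-based authentication. The storage account, container, and file paths are hypothetical placeholders, and real vendor integrations involve considerably more moving parts.

```python
# Minimal sketch: upload a file to ADLS Gen2 using token-based authentication.
# The account URL, container, and paths below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# DefaultAzureCredential resolves credentials from the environment:
# managed identity in Azure, environment variables, or a local `az login`.
credential = DefaultAzureCredential()

service = DataLakeServiceClient(
    account_url="https://exampleaccount.dfs.core.windows.net",
    credential=credential,
)

filesystem = service.get_file_system_client(file_system="claims-landing")
file_client = filesystem.get_file_client("vendor-feed/2024/claims.csv")

with open("claims.csv", "rb") as data:
    # Overwrite any existing file at that path in the lake.
    file_client.upload_data(data, overwrite=True)
```

Because the credential is resolved from the environment, the same script can run unchanged on a developer laptop or inside Azure, which is one reason token-based authentication is preferred over embedded connection strings.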
On-Prem & Automation Foundations (Linux + Python)
- Learn how a high-performance Linux environment is used to support automation and data processing.
- Assist in developing and supporting Python-based automation and ETL scripts (a brief sketch follows this list), including:
  - File ingestion and transformation
  - API data retrieval
  - Scheduled job execution
- Gain exposure to workflow orchestration tools (such as Airflow) and automation best practices.
- Learn basic monitoring, logging, and troubleshooting techniques for data services.
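For context, the sketch below shows the shape of a typical ingestion script: pull records from an API, reshape a few fields, and write a dated CSV for downstream loading. The endpoint URL, field names, and output path are hypothetical placeholders, not the team's actual systems.

```python
# Minimal sketch of an ingest-and-transform job: pull records from an API,
# flatten a couple of fields, and write a dated CSV for downstream loading.
# The endpoint URL, field names, and output path are hypothetical placeholders.
import csv
from datetime import date
from pathlib import Path

import requests

API_URL = "https://api.example.com/v1/claims"   # hypothetical vendor endpoint
OUT_DIR = Path("/data/landing/claims")          # hypothetical landing directory


def run() -> Path:
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()                     # fail loudly so the scheduler can alert
    records = resp.json()

    OUT_DIR.mkdir(parents=True, exist_ok=True)
    out_file = OUT_DIR / f"claims_{date.today():%Y%m%d}.csv"

    with out_file.open("w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["claim_id", "status", "amount"])
        writer.writeheader()
        for rec in records:
            writer.writerow(
                {
                    "claim_id": rec["id"],
                    "status": rec["status"],
                    "amount": rec.get("amount", 0),
                }
            )
    return out_file


if __name__ == "__main__":
    # In practice a job like this runs on a schedule, e.g. under cron or as an Airflow task.
    print(run())
```

Scripts like this are what orchestration tools such as Airflow wrap with scheduling, retries, and alerting, which is where the monitoring and logging practices above come in.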
Data Engineering & Analytics Exposure
- Support data engineering efforts that power:
  - Operational dashboards
  - Automated reporting
  - Real-time insights for Property Claims operations
- Learn how data flows across systems such as:
  - Oracle
  - SQL Server
  - Snowflake
  - Cloud storage platforms
- Collaborate with experienced engineers, analysts, and architects while contributing to real project deliverables.
Who This Internship Is For
This role is ideal for someone who:
- Is pursuing or has recently completed a degree in:
  - Computer Science
  - Data Engineering
  - Information Systems
  - Software Engineering
  - Cloud / IT-related disciplines
- Enjoys problem-solving and learning how systems work end-to-end
- Is curious about cloud platforms, automation, and modern data architecture
- Thrives in a collaborative, fast-moving technical environment
Foundational Qualifications
- Coursework or academic experience in one or more of the following:
  - Python or another programming language
  - SQL or database concepts
  - Cloud computing fundamentals
  - Linux or operating systems concepts
- Familiarity (academic or personal projects) with:
  - APIs
  - Data processing
  - Automation or scripting
- Strong desire to learn, ask questions, and grow technical skills
Nice to Have (Not Required)
- Exposure to any of the following through coursework or projects:
  - Microsoft Azure
  - Terraform or Infrastructure as Code concepts
  - Microsoft Fabric or data platforms
  - GitHub or version control
  - Event streaming concepts (Event Hubs, Kafka)
  - ETL or data pipelines