DevOps Data Engineer – Hybrid (Chicago, IL)
Position Overview
We are seeking a highly capable DevOps Data Engineer to support a large-scale enterprise data initiative. This role focuses on optimizing cloud-based infrastructure, automating deployments, and supporting advanced analytics capabilities in a fast-paced environment. The ideal candidate will possess a strong command of AWS, Terraform, Python, and CI/CD pipelines, with the ability to troubleshoot complex data issues and deliver scalable, production-grade solutions.
Key Responsibilities
Support and optimize large-scale data pipelines in distributed cloud environments.
Configure and maintain production AWS environments for data infrastructure and analytics frameworks.
Develop and manage Terraform modules to automate infrastructure deployments.
Collaborate with cross-functional teams to implement robust data acquisition, transformation, and storage workflows.
Work with Databricks and other data lake technologies to enable analytics and machine learning workflows.
Design and deliver batch and streaming data applications using AWS services.
Lead troubleshooting and root cause analysis of issues in production systems.
Provide hands-on technical leadership in DevOps processes, tooling, and deployment best practices.
Participate in agile development cycles with product owners, project managers, and engineering leads.
Guide cross-functional teams through implementation, validation, and delivery of technical solutions.
Required Skills & Qualifications
Bachelor’s degree in Software Engineering, Statistics, Information Systems, or a related field.
5+ years of experience in data engineering, DevOps, or ETL development roles.
Advanced proficiency in AWS cloud services (Redshift, S3, Glue, Lambda, Athena, etc.).
Expertise in Python, Bash scripting, PySpark, and Terraform.
Strong experience building and maintaining CI/CD pipelines.
Working knowledge of SQL and database tools (e.g., SSMS, Oracle SQL Developer, BigQuery).
Ability to work with structured, semi-structured, and unstructured datasets.
Excellent communication and analytical skills with the ability to clearly explain technical solutions.
Strong initiative and ownership of projects from start to finish.
Preferred Skills & Certifications
AWS Certification (e.g., Solutions Architect).
Experience with Kubernetes, blue-green deployments, or Ansible playbooks.
Master’s degree in a related field.
Familiarity with Quantum Metric and Akamai.
Experience in Agile environments and CI/CD delivery pipelines.
Background in transportation or logistics data systems.
Work Schedule & Location
Work Model: Hybrid
Onsite Requirement: 3–4 days per week
Location: Downtown Chicago, IL
Work Hours: Monday – Friday, 9:00 AM – 5:00 PM CT
Contract Details
Contract Length: 10 months
Employment Type: Contract
Rate: $90+ per hour
If you are passionate about leveraging cloud technologies to build modern data platforms and thrive in high-ownership, collaborative environments, we invite you to apply.