Sr. AWS Infrastructure and Data Engineer – Remote

Remote, USA · Full-time · Posted 2025-11-24
Overview

We're looking for a highly skilled Sr. AWS Infrastructure and Data Engineer to join our team. This individual will play a crucial role in designing, building, and maintaining our cloud-based data infrastructure and pipelines on the AWS platform. You'll be responsible for ensuring our data systems are scalable, secure, and efficient, enabling our analytics and business intelligence teams to extract valuable insights. The ideal candidate has deep expertise in AWS services related to both infrastructure and data, a strong understanding of data engineering principles, and a passion for automation and best practices.

Responsibilities

• Design and architect scalable and reliable data infrastructure on AWS, using services such as EC2, S3, RDS, Redshift, and Lambda.
• Build and maintain robust ETL/ELT data pipelines using AWS Glue, DataSync, and other relevant services to ingest, transform, and load data from various sources.
• Implement and manage data lakes and data warehouses, ensuring data quality, governance, and security.
• Develop and deploy infrastructure as code (IaC) using tools like Terraform or CloudFormation to automate the provisioning and management of AWS resources.
• Monitor and optimize the performance, cost, and security of our AWS data environment.
• Collaborate with data scientists, analysts, and other engineers to understand their data needs and provide effective solutions.
• Troubleshoot and resolve issues related to data pipelines, infrastructure, and performance.
• Stay up-to-date with the latest AWS services and industry best practices.

Requirements

• years of experience in a similar role with a strong focus on AWS and data engineering.
• Proven expertise in core AWS services, including but not limited to S3, EC2, Lambda, RDS, Redshift, Glue, and Step Functions.
• Strong proficiency in at least one scripting language (e.g., Python).
• Extensive experience with Infrastructure as Code (IaC) tools, particularly Terraform.
• Solid understanding of database concepts, data warehousing, and ETL/ELT processes.
• Familiarity with version control systems (Git).
• Excellent problem-solving skills and the ability to work independently and as part of a team.
• Strong communication skills to effectively collaborate with technical and non-technical stakeholders.

Nice to have

• AWS Professional certifications (e.g., AWS Certified Data Analytics – Specialty, AWS Certified Solutions Architect – Professional).
• Experience with big data technologies like Spark, Hadoop, or Snowflake.
• Familiarity with CI/CD pipelines (e.g., Jenkins, GitHub Actions).
• Knowledge of containerization technologies (e.g., Docker, Kubernetes).
• Experience with stream processing technologies (e.g., Kinesis, Kafka).

Apply to this Job