AWS MLOps Engineer - REMOTE
United States
We currently have a career opportunity for an AWS MLOps Engineer to join our team. While our headquarters is in St. Louis, MO, this position is remote and can be based anywhere within the continental United States.
Job Overview:
Perficient is seeking an experienced AWS MLOps Engineer to design, implement, and maintain robust data pipelines and AI/ML deployment workflows. The ideal candidate will have a strong background in Python, AWS SageMaker, Bedrock, and AI/ML deployment, along with hands-on experience in various DevOps tools and practices.
Perficient is always looking for the best and brightest talent, and we need you! We’re a quickly growing, global digital consulting leader, and we’re transforming the world’s largest enterprises and biggest brands. You’ll work with the latest technologies, expand your skills, and become a part of our global community of talented, diverse, and knowledgeable colleagues.
Responsibilities:
- Design, build, and manage data pipelines using AWS services, including AWS Lakehouse and Amazon Managed Workflows for Apache Airflow (MWAA).
- Deploy and maintain AI/ML models using AWS SageMaker and Bedrock.
- Develop and implement CI/CD workflows using GitHub Actions, Terraform Cloud, and JFrog Artifactory.
- Collaborate with data scientists and ML engineers to integrate and deploy ML models.
- Manage the end-to-end SDLC of ML models, ensuring best practices in version control, testing, and deployment.
- Optimize data workflows for performance, scalability, and reliability.
- Troubleshoot and resolve issues related to data pipelines and ML deployments.
Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 3-4 years of experience in AI/ML and data pipeline engineering.
- Proficiency in Python and experience with AI/ML deployment.
- Strong experience with AWS services, including SageMaker, Bedrock, and AWS Lakehouse.
- Hands-on experience with MWAA and Airflow for workflow management.
- Familiarity with DevOps tools such as GitHub Actions, Terraform Cloud, and JFrog Artifactory.
- Understanding of the SDLC for ML models and best practices for deployment.
- Strong problem-solving skills and ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- AWS certification (e.g., AWS Certified Machine Learning, AWS Certified Solutions Architect).
- Experience with additional ML frameworks and libraries.
- Familiarity with data engineering best practices and tools.
The salary range for this position takes into consideration a variety of factors, including but not limited …
Benefits: Work-life balance, global community, health insurance, professional development, remote work flexibility, and retirement plans.