AWS Data Engineer
Overland Park, KS, US
Location Designation: Hybrid - 3 days per week
This position follows a hybrid work schedule: remote on Monday and Friday; on site Tuesday, Wednesday, and Thursday.
Within Institutional Life at New York Life, you'll join a team dedicated to providing tailored life insurance solutions for businesses and organizations of all sizes. From corporate-owned life insurance (COLI) strategies to bank-owned life insurance (BOLI) programs, your work contributes to the financial well-being of countless institutions and their stakeholders. By designing customized life insurance solutions, you will empower institutions to invest in their long-term goals with confidence, fostering stability and growth for generations to come.
Role Overview:
We are seeking a skilled and motivated AWS Data Analyst/Engineer to join our team and play a pivotal role in developing, managing, and analyzing large-scale data solutions on AWS. This role blends expertise in data engineering, analytics, and cloud technologies, empowering the organization with insights to drive business decisions. The ideal candidate will have hands-on experience with AWS services, data pipelines, and visualization tools, along with a strong analytical mindset.
What You’ll Do:
1. Data Engineering and Pipeline Development
- Design, build, and maintain scalable ETL/ELT data pipelines using AWS services such as AWS Glue, Lambda, Step Functions, and Data Pipeline.
- Automate data ingestion and transformation from various sources, including S3, Redshift, RDS, DynamoDB, and on-premises systems.
- Implement and optimize data workflows for structured, semi-structured, and unstructured data.
2. Data Analytics and Insights
- Work closely with stakeholders to define key metrics and generate actionable insights.
- Build and optimize queries in Athena, Redshift, and other AWS services to analyze large datasets efficiently.
- Design, implement, and maintain dashboards and reports using tools like QuickSight and Tableau.
3. Data Governance and Quality
- Implement data governance policies, ensuring data security, compliance, and privacy (e.g., GDPR, CCPA).
- Develop and enforce best practices for data validation, error handling, and data lineage using tools like Lake Formation or AWS Glue Data Catalog.
- Monitor and improve data quality across multiple pipelines and systems.
4. Cloud Infrastructure and Optimization
- Optimize data storage and retrieval strategies for cost efficiency and scalability using AWS storage options like S3, Glacier, and Redshift Spectrum.
- Manage AWS infrastructure to …
Benefits/Perks: Adoption assistance, annual discretionary bonus, sales bonus eligibility, incentive program, leave programs, overtime eligibility, student loan repayment programs, employee giving and volunteerism programs, hybrid work schedule, pay transparency.
Tasks:
- Analyze large datasets
- Design and maintain data pipelines
- Implement data governance and compliance
- Optimize cloud infrastructure
- Collaborate with stakeholders
- Mentor junior team members
Skills: Apache Spark, Athena, AWS Glue, AWS Lambda, AWS Step Functions, AWS Data Pipeline, CloudWatch, DynamoDB, RDS, Redshift, S3, QuickSight, Tableau, Python, SQL, Terraform, ETL/ELT, data engineering, data processing, data governance, data quality and validation, cloud infrastructure, GDPR/CCPA compliance, life insurance, leadership, mentorship.
Timezones: America/Anchorage, America/Chicago, America/Denver, America/Los_Angeles, America/New_York, Pacific/Honolulu (UTC-5 through UTC-10).