
Data Solutions Architect - AI

Remote, United States

Overview

The Data Solutions Architect provides architectural leadership and guidance in their assigned business area, aligning solution development efforts with the broader architectural vision and roadmap. The architect's skills and knowledge must span multiple domains (application/solution architecture, technical/infrastructure architecture, and information/data architecture) and include experience with relevant implementation technologies, platforms, and tools. As a leader, the architect is expected to influence the assigned organization and work independently with senior business and IT leaders, while acting as a mentor and role model within the development teams. The architect is accountable for the successful implementation of architecture in their assigned organization and teams, which requires excellent communication skills, the ability to work with developers of differing skill levels, and the ability to recognize, surface, and resolve architectural issues collaboratively.

Responsibilities

  • Design and lead a robust, scalable, and secure end-to-end data architecture that supports our business objectives and enables data-driven decision-making
  • Collaborate with stakeholders across the organization to define and articulate data strategy and objectives aligned with business goals 
  • Enable data governance and stewardship by establishing governance frameworks, data quality controls, and monitoring mechanisms that ensure alignment on data policies and standards
  • Design and develop data models, schemas, and data warehouse architectures that support business intelligence, analytics, and reporting requirements 
  • Architect efficient data integration pipelines  
  • Ensure data lineage and traceability across the organization's data ecosystem
  • Implement data security measures, access controls, and authentication mechanisms to protect sensitive data and ensure compliance 
  • Implement data retention, archiving, disaster recovery, and business continuity policies and plans 
  • Monitor data quality, performance bottlenecks, and data flows for efficient and reliable data delivery 
  • Complete all responsibilities as outlined on annual Performance Plan.
  • Complete all special projects and other duties as assigned.
  • Must be able to perform duties with or without reasonable accommodation.

Qualifications

  • Bachelor's degree (Master's preferred) in computer science, engineering, or mathematics, or equivalent work experience 
  • Experience communicating with technical and non-technical audiences, including training, workshops, and publications, is a plus 
  • 7+ years of technical specialist, design, and architecture experience 
  • 5+ years of experience with database and data platform technologies (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) 
  • 5+ years of cloud-based solution (AWS/Azure/GCP), system, network, and operating system experience 
  • Experience designing and architecting distributed data systems 
  • Solid programming skills in Python, Scala, Java, SQL 
  • Experience with Data Engineering technologies (e.g., Spark, Hadoop, Kafka) 
  • Experience with Data Warehousing (e.g., SQL, OLTP/OLAP/DSS), as well as Data Science and Machine Learning technologies (including GenAI) 
  • Nice to have: Experience with Databricks  

 

Mental Requirements

  • Communicating with others to exchange information.
  • Assessing the accuracy, neatness, and thoroughness of the work assigned.

Physical Requirements and Working Conditions

  • Remaining in a stationary position, often standing or sitting for prolonged periods.
  • Repeating motions that may include the wrists, hands, and/or fingers.
  • Must be able to provide a dedicated, secure work area.
  • Must be able to provide and maintain high-speed internet access/connectivity and an office setup.
  • No adverse environmental conditions expected.

Base compensation ranges from $120K to $160K. Specific offers are determined by various factors, such as experience, education, skills, certifications, and other business needs. This role is eligible for discretionary bonus consideration.

 

Cotiviti offers team members a competitive benefits package to address a wide range of personal and family needs, including medical, dental, vision, disability, and life insurance coverage, 401(k) savings plans, paid family leave, 9 paid holidays per year, and 17-27 days of Paid Time Off (PTO) per year, depending on specific level and length of service with Cotiviti. For information about our benefits package, please refer to our Careers page.

 

This role is based remotely and all interviews will be conducted virtually.

 

Date of posting: 8/29/2024

Applications are assessed on a rolling basis. We anticipate that the application window will close on 6/29/2024, but the application window may change depending on the volume of applications received or close immediately if a qualified candidate is selected.

 
