
Data Engineer - ETL

Company: Optimal Staffing
Address: Dearborn, MI, United States
Employment type: Full-time
Category: Staffing and Recruiting
Expires: 2023-08-13
Posted: 10 months ago
Job Description


As an MLOps Engineer at the Company, you will be an integral part of the Quality Analytics Team within the GDIA organization. Your role will involve collaborating with the modeling team to transition new models from proof of concept to production. Additionally, you will be responsible for establishing a robust back-end infrastructure to deploy our machine learning models in real-time streaming contexts. We are seeking a talented individual who possesses a combination of technical skills, industry experience, and strong communication abilities.


Responsibilities


  • Collaborate with cross-functional teams to ensure the successful integration of ML models into production systems.
  • Design and develop ETL pipelines to ensure seamless data integration and processing for model training and inference.
  • Create a back-end infrastructure that supports the deployment of our machine learning models in real-time streaming contexts.
  • Collaborate with the modeling team to facilitate the smooth transition of new models from proof of concept to production, ensuring scalability, reliability, and efficiency.
  • Work closely with the data engineering team to optimize and streamline data pipelines and workflows.
  • Stay updated with the latest advancements in MLOps and implement best practices to enhance model deployment and monitoring.
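
The ETL responsibility above follows a standard extract-transform-load shape. A minimal, dependency-free sketch of that pattern is shown below; the CSV layout, field names, and in-memory "store" are illustrative assumptions, not details from the posting.

```python
import csv
import io

# Hypothetical raw export; column names are illustrative only.
RAW_CSV = """vehicle_id,reading,timestamp
V1,12.5,2023-06-01
V2,,2023-06-01
V3,7.25,2023-06-02
"""

def extract(text):
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing readings and cast numerics."""
    cleaned = []
    for row in rows:
        if not row["reading"]:
            continue  # skip incomplete records rather than imputing
        cleaned.append({"vehicle_id": row["vehicle_id"],
                        "reading": float(row["reading"]),
                        "timestamp": row["timestamp"]})
    return cleaned

def load(records, store):
    """Load: append cleaned records to a destination (a list here)."""
    store.extend(records)
    return store

store = []
load(transform(extract(RAW_CSV)), store)
```

In a production system the "load" target would be a warehouse table or feature store rather than a list, but the three-stage structure stays the same.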


Skills Required


1+ year of experience working with Google Cloud Platform (GCP) services, leveraging its capabilities for ML model deployment.


2+ years of experience in Python programming, including libraries such as TensorFlow, PyTorch, or scikit-learn.
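
As a rough illustration of the Python/ML-library experience asked for here, the sketch below trains a scikit-learn classifier on synthetic data; the feature values and labels are invented for the example and do not come from the posting.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic quality-inspection data (illustrative only):
# two features per part, label 1 = defective.
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
y = np.array([0, 0, 1, 1])

# Fit a simple classifier and score a new part.
model = LogisticRegression().fit(X, y)
pred = model.predict(np.array([[0.85, 0.9]]))
```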


2+ years of experience in Java programming, including knowledge of frameworks such as Apache Kafka.


Experience Required


  • Experience working with Kubernetes in an industry context, managing containerized applications and orchestrating deployments.
  • Proficiency in writing and maintaining ETL pipelines, extracting data from various sources and transforming it for model training and inference.
  • Familiarity with Apache Beam for building data processing pipelines.
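
Beam-style data processing chains per-element transforms over a stream of records. The dependency-free sketch below mimics that shape with plain generators (each stage is loosely analogous to a Beam ParDo); the record fields and valid-range check are illustrative assumptions.

```python
def parse(lines):
    """Per-element parse step: split raw lines into records."""
    for line in lines:
        vid, value = line.split(",")
        yield {"vehicle_id": vid, "value": float(value)}

def filter_valid(events):
    """Drop out-of-range readings before inference."""
    for ev in events:
        if 0.0 <= ev["value"] <= 100.0:
            yield ev

def featurize(events):
    """Scale the raw value into a model-ready feature."""
    for ev in events:
        yield {"vehicle_id": ev["vehicle_id"],
               "feature": ev["value"] / 100.0}

# Chain the stages over a small in-memory "stream".
stream = ["V1,42.0", "V2,150.0", "V3,7.5"]
features = list(featurize(filter_valid(parse(stream))))
```

A real Beam pipeline would express the same stages as PTransforms running on a distributed runner, but the per-element composition is the same idea.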


Experience Preferred


Previous experience working in a large, data-driven organization, with exposure to complex analytics workflows and data systems.


Education Required


Bachelor's degree in computer science or a related field.


Education Preferred


Master's degree in computer science or a related field.


Additional Information


Strong communication skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.