
Python/DevOps Developer

Company

ANC Inc

Address San Francisco, CA, United States
Employment type CONTRACTOR
Salary
Expires 2023-08-08
Posted at 10 months ago
Job Description

Job role: DevOps/Python Developer

Location: San Francisco, CA or Cupertino, CA (Hybrid)

Type: Contract

Job description:

• Can you please provide a summary of the project/initiatives that describes what’s being done?

1. We will develop an AI/Client model inferencing pipeline that automates the extraction of data elements from documents or from source streaming data. The pipeline will leverage the elastic nature of the cloud to optimize cost across different use cases.


• What are the top 5-10 responsibilities for this position? (Please be detailed as to what the candidate is expected to do or complete on a daily basis)

1. You will design, develop, test, deploy, maintain, and enhance machine learning pipelines using K8s/AKS-based Argo Workflow orchestration solutions.

2. Participate and contribute to design reviews with the platform engineering team to decide on design, technologies, project priorities, deadlines, and deliverables.

3. You will work closely with the Data Lake and Data Science teams to understand their data structures and machine learning algorithms.

4. Apply an understanding of ETL pipelines, ingress/egress methodologies, and design patterns.

5. Implement real-time Argo Workflow pipelines, integrate them with machine learning models, and deliver data and model results into business stakeholders’ Data Lake.

6. Develop distributed machine learning pipelines for training and inferencing using Argo, Spark, and AKS.

7. Build highly scalable backend REST APIs to collect data from the Data Lake and support other use cases/scenarios.

8. Deploy applications in Azure Kubernetes Service using GitLab CI/CD, Jenkins, Docker, kubectl, Helm, and manifests.

9. Manage branching, tagging, and version maintenance across the different environments in GitLab.

10. Review code developed by other developers and provide feedback to ensure best practices (e.g., checking code in, accuracy, testability, and efficiency).

11. Debug, track, and resolve issues by analyzing their sources and their impact on application, network, or service operations and quality.

12. Perform functional, benchmark, and performance testing and tuning for the built workflows.

13. Assess, design, and optimize resource capacities (e.g., memory, GPU) for Client-based, resource-intensive workloads.
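Responsibility 7 above calls for building backend REST APIs in Python. A minimal sketch with Flask is shown below; the endpoint paths and sample records are illustrative assumptions, not details from this posting:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory stand-in for records that a real service
# would fetch from the Data Lake.
RECORDS = {
    "doc-001": {"id": "doc-001", "status": "extracted"},
    "doc-002": {"id": "doc-002", "status": "pending"},
}

@app.route("/records", methods=["GET"])
def list_records():
    # Return every record as a JSON array.
    return jsonify(list(RECORDS.values()))

@app.route("/records/<record_id>", methods=["GET"])
def get_record(record_id):
    # Return one record, or a 404 if the id is unknown.
    record = RECORDS.get(record_id)
    if record is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(record)

# app.run(port=8080)  # uncomment to serve locally
```

In a production version, the in-memory dict would be replaced by queries against the Data Lake, and the service would be containerized and deployed to AKS via Helm, along the lines of responsibility 8.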

• What skills/technologies are required (please include the number of years of experience required)?


1. Bachelor’s/Master’s degree in Computer Science or Data Science

2. 5 to 8 years of experience in software development and with data structures/algorithms

3. 5 to 7 years of experience with the programming languages Python or Java, and with database languages (e.g., SQL and NoSQL)

4. 5 years of experience in developing large-scale infrastructure, distributed systems, or networks, and experience with compute technologies and storage architecture

5. Strong understanding of microservices architecture and experience building and deploying REST APIs using Python with Flask and Django

6. 5 years of experience with unit and functional test cases using PyTest and unittest, and with mocking external services, for functional and non-functional requirements

7. Strong understanding and experience with Kubernetes for availability and scalability of the application in Azure Kubernetes Service

8. Experience in building and deploying applications with Azure, using third-party tools (e.g., Docker, Kubernetes and Terraform)

9. Experience with cloud tools like Azure and Google Cloud Platform

10. Experience with development tools, CI/CD pipelines such as GitLab CI/CD, Artifactory, Cloudbees and Jenkins
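Skill 6 in the list above asks for PyTest-style unit tests that mock external services. The sketch below illustrates the pattern; the function, the endpoint URL, and the use of the `requests` library are assumptions for illustration, not details from this posting:

```python
from unittest import mock

import requests  # assumed HTTP client; not named in the posting

def fetch_record_status(record_id, base_url="https://datalake.example.com"):
    """Fetch one record from a hypothetical Data Lake REST endpoint
    and return its 'status' field."""
    resp = requests.get(f"{base_url}/records/{record_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()["status"]

def test_fetch_record_status():
    # Mock the external service so the test never touches the network.
    fake = mock.Mock()
    fake.json.return_value = {"id": "doc-001", "status": "extracted"}
    with mock.patch("requests.get", return_value=fake) as mocked_get:
        assert fetch_record_status("doc-001") == "extracted"
        mocked_get.assert_called_once()
```

Patching `requests.get` keeps the test deterministic and fast, which matters when such tests run inside the GitLab CI/CD or Jenkins pipelines mentioned in skill 10.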

• What skills/attributes are preferred (these are desired, not required)?

1. Python, Kubernetes, Argo Workflows, Argo Events, Hive, SQL, NoSQL, REST APIs, Helm, Docker, Jenkins

• What does the interview process look like?