
Data Engineer Jobs

Company: Ford Motor Company
Address: Dearborn, MI 48126
Employment type: Full-time
Salary:
Expires: 2023-10-07
Posted: 8 months ago
Job Description

The GDIA Data Factory (DF) covers all business processes and technical components involved in ingesting a wide range of enterprise data and transforming it into consumable data sets and APIs in support of analytics. Data is a critical enabler of our Smart Vehicles in a Smart World aspiration; it is the fuel for AI/ML, Customer Experience, Fitness, and Innovation.

The DFE (Data Factory Enablement) team will own end-to-end responsibility for the Data Factory Platform Infrastructure/Operations/Guidelines, which includes Project Onboarding/Infrastructure Architecture, the Contributor Access process and its management, support for all other teams onboarding into the GCP world, and more. The DF Curation Team will provide tools, solutions, and guidelines for the transformation of that data into consumable data sets and APIs in support of analytics. This team will own the tools it creates, with continuous development and enhancements.

This position provides an unparalleled opportunity to shine on a global team responsible for supporting some of the critical Data Factory (DF) initiatives. The DFE & Curation Tools team is looking for a Data Engineer who will define, create, and document frameworks that enable DF teams to build consumable data sets and data products in a reusable fashion. The ideal candidate will be a hands-on full stack data engineer with experience designing and building CI/CD pipelines for Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, Data Fusion, and DBT.
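To give a flavor of the "reusable framework for building consumable data sets" idea described above, here is a minimal, hypothetical Python sketch. The `DatasetPipeline` class, step names, and sample fields are invented for illustration and are not Ford's actual tooling; a real implementation would run these steps inside GCP services such as Dataflow or DBT.

```python
from typing import Callable, Dict, List

Row = Dict[str, object]
Step = Callable[[List[Row]], List[Row]]

class DatasetPipeline:
    """Tiny registry of named transformation steps that contributor
    teams could reuse to turn raw rows into a consumable data set.
    Purely illustrative, not an actual Data Factory API."""

    def __init__(self) -> None:
        self._steps: List[Step] = []

    def step(self, fn: Step) -> Step:
        # Register a transformation step; steps run in registration order.
        self._steps.append(fn)
        return fn

    def run(self, rows: List[Row]) -> List[Row]:
        for fn in self._steps:
            rows = fn(rows)
        return rows

pipeline = DatasetPipeline()

@pipeline.step
def drop_incomplete(rows: List[Row]) -> List[Row]:
    # Curation: keep only rows with a vehicle id and a mileage reading.
    return [r for r in rows if r.get("vin") and r.get("miles") is not None]

@pipeline.step
def add_mileage_band(rows: List[Row]) -> List[Row]:
    # Enrichment: derive a simple consumable attribute.
    for r in rows:
        r["band"] = "high" if r["miles"] > 50_000 else "low"
    return rows

raw = [
    {"vin": "1FA123", "miles": 62_000},
    {"vin": None, "miles": 10},
    {"vin": "1FA456", "miles": 12_000},
]
curated = pipeline.run(raw)
```

The point of the registry pattern is that each team contributes small, documented steps while the framework owns ordering and execution, which is one common way to make data-set building reusable.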


Required Qualifications:

  • 2-4 years of hands-on experience with Git and Tekton
  • Experience in working with Agile and Lean methodologies
  • In-depth understanding of GCP product technology and underlying architectures
  • Strong analytical and problem-solving skills
  • Minimum 5 years’ experience with SQL
  • 2-3 years of Terraform experience (IaC)
  • 3-5 years of hands-on full stack data engineering experience in GCP, including in-depth experience with BigQuery, GCS, Dataflow, and Pub/Sub
  • Excellent interpersonal, written and oral communication skills
  • B.S. in Information Systems, Computer Science, Computer Engineering, or equivalent work experience

Desired Qualifications:

  • Ability to contribute individually or as part of a team
  • 1-2 years of experience with Looker Studio
  • Patience and a helpful attitude when supporting Data Factory teams as a platform team member
  • M.S. in Information Systems, Computer Science, Computer Engineering, or a related field

What you’ll receive in return:

As part of the Ford family, you’ll enjoy excellent compensation and a comprehensive benefits package that includes generous PTO, retirement, savings, and stock investment plans, incentive compensation, and much more. You’ll also experience exciting opportunities for professional and personal growth and recognition.

Candidates for positions with Ford Motor Company must be legally authorized to work in the United States. Verification of employment eligibility will be required at the time of hire. Visa sponsorship may be available for this position.

We are an Equal Opportunity Employer committed to a culturally diverse workforce. All qualified applicants will receive consideration for employment without regard to race, religion, color, age, sex, national origin, sexual orientation, gender identity, disability status, or protected veteran status.

For information on Ford's salary and benefits, please visit:

https://corporate.ford.com/content/dam/corporate/us/en-us/documents/careers/2022-benefits-and-comp-GSR-sal-plan-2.pdf

At Ford, the health and safety of our employees is our top priority. Vaccination has been proven to play a critical role in combating COVID-19. As a result, Ford has made the decision to require U.S. salaried employees to be fully vaccinated against COVID-19, unless employees require an accommodation for religious or medical reasons. Being fully vaccinated means that an individual is at least two weeks past their final dose of an authorized COVID-19 vaccine regimen. As a condition of employment, newly hired employees will be required to provide proof of their COVID-19 vaccination or an approved medical or religious exemption. (As of June 19, 2023, we are no longer requiring a Covid-19 vaccination)


Responsibilities:

  • Work closely with multiple teams (Ingestion, Classification, Curation, Access Management, Quality, and User Activity Monitoring) to define, build, integrate, execute, and support the GCP Data Factory pipeline, enabling migration of 100% of Hadoop sources to the GCP DF
  • Support Data Factory teams with Terraform and containerization
  • Provide guidelines, documents, and best practices for Contributor Teams
  • Support all Data Factory teams with CI/CD tools like Tekton and Cloud Build
  • Develop, implement, and monitor complete full stack data engineering tasks to support DFE day-to-day operations
  • Create frameworks that enable Contributor Teams to build consumable data sets and increase reusability
  • Participate in technology renewal activities for DFE, including end-to-end integration and regression testing
  • Keep up to date with the latest trending technologies in data engineering and GCP service offerings
  • Support other operations responsibilities that fall under DFE, such as project billing, security, and tech debt
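The migration work above centers on moving event-driven ingestion (e.g., Pub/Sub into BigQuery) onto GCP. As a conceptual sketch only, the snippet below simulates that publish/subscribe flow with in-memory stand-ins; the `Topic` class and field names are invented, and a real pipeline would use the google-cloud-pubsub and BigQuery client libraries instead.

```python
from typing import Callable, Dict, List

class Topic:
    """In-memory stand-in for a Pub/Sub topic: subscribers receive
    every published message. Illustrative only."""

    def __init__(self) -> None:
        self._subscribers: List[Callable[[Dict], None]] = []

    def subscribe(self, callback: Callable[[Dict], None]) -> None:
        self._subscribers.append(callback)

    def publish(self, message: Dict) -> None:
        # Fan each message out to every subscriber.
        for cb in self._subscribers:
            cb(message)

table: List[Dict] = []  # stand-in for a BigQuery table

telemetry = Topic()
# Subscriber acts like a streaming insert into the "table",
# tagging each row as ingested.
telemetry.subscribe(lambda msg: table.append({**msg, "ingested": True}))

telemetry.publish({"vin": "1FA123", "speed": 61})
telemetry.publish({"vin": "1FA456", "speed": 48})
```

The same shape (topic → subscriber → table) is what a Dataflow or Pub/Sub-to-BigQuery pipeline implements at scale, with durability and exactly-once semantics handled by the managed services.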