
Data Engineer Jobs

Company

Deloitte

Address Alexandria, VA, United States
Employment type FULL_TIME
Salary
Category IT Services and IT Consulting, Business Consulting and Services, Accounting
Expires 2023-09-04
Posted at 9 months ago
Job Description
Are you a driven problem solver looking to help our clients tackle some of the most pressing challenges within Government and Public Services (GPS)? Join Deloitte's Program Integrity practice to help government agencies protect taxpayer money. To address the threats that perpetuate fraud, waste, and abuse, our clients look to our team to provide the guidance and solutions required to help them stay ahead of emerging issues and protect the integrity of their programs. If you are looking for a rapidly growing, collaborative environment with opportunities to make an impact and grow, our Program Integrity team would be a great fit for you!


Work you'll do


  • Optimize ETL workflows and data processing to enhance performance, scalability, and reliability
  • Collaborate with clients to understand their data integration needs, including data sources, formats, and transformation requirements
  • Design, develop, and deploy scalable and efficient ETL pipelines to extract, transform, and load client data into our systems
  • Develop and maintain documentation related to client-specific ETL processes, configurations, and data transformations
  • Build data systems and pipelines; evaluate business needs and objectives
  • Troubleshoot and resolve issues related to data ingestion, transformation, and loading
  • Implement data quality checks and validation processes to ensure accuracy, consistency, and integrity of client data
  • Define, produce, test, review and debug solutions
  • Assist onboarding process for new clients, ensuring smooth data integration and adherence to project timelines
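The responsibilities above center on ETL pipelines with data quality checks. As an illustration only (the posting does not name a specific stack for this step), here is a minimal extract-transform-load sketch in plain Python with the standard-library `sqlite3` module, where records failing a type check are diverted to a reject list rather than loaded. All data, table names, and field names are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract: client records arriving as CSV text.
RAW_CSV = """id,name,amount
1,Acme,120.50
2,Beta,not_a_number
3,Gamma,75.00
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV feed into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Transform: apply a data quality check, separating clean rows from rejects."""
    clean, rejected = [], []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])  # validation: amount must be numeric
            clean.append(row)
        except ValueError:
            rejected.append(row)  # quarantined for review instead of loaded
    return clean, rejected

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: write validated rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER PRIMARY KEY, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO payments VALUES (:id, :name, :amount)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
clean, rejected = transform(extract(RAW_CSV))
load(clean, conn)
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone()
print(count, total)   # 2 195.5
print(len(rejected))  # 1
```

In a production pipeline the same extract/transform/load split would typically run on Spark or AWS EMR against a data warehouse or data lake, as the qualifications below describe; the reject path is what makes accuracy and consistency checks auditable.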


The team


Deloitte's Government and Public Services (GPS) practice (our people, ideas, technology, and outcomes) is designed for impact. Serving federal, state, and local government clients as well as public higher education institutions, our team of more than 15,000 professionals brings fresh perspective to help clients anticipate disruption, reimagine the possible, and fulfill their mission promise.


We bring a rigorous approach to help government agencies effectively detect, prevent, and respond to issues related to fraud, waste, and abuse. Our team helps tackle these threats by bringing cutting edge analytics and AI experience with innovative mindsets. Our Program Integrity team focuses on thought diversity and collaborative problem solving to help clients address these challenges holistically, with a common goal to protect the integrity of their programs.


Qualifications


Required:


  • Experience with Apache Spark or AWS EMR
  • Strong knowledge of Python & SQL
  • GIT repositories knowledge
  • Must be able to obtain and maintain the required clearance for this role
  • Knowledge of Docker and Kubernetes
  • Bachelor's degree required
  • AWS, Azure cloud experience
  • Technical expertise with Data warehouse or Data Lake and various database design practices


Preferred:


  • Experience working with big data technologies at scale
  • BI tools such as Tableau and Jaspersoft
  • Azure DevOps, JIRA or similar project tracking software
  • BigQuery or similar (Redshift, Snowflake, or other MPP databases)
  • Data streaming such as Apache Kafka, AWS Kinesis, Spark Streaming, or similar tools
  • Experience with RDS in AWS


#RLSFY24