Azure Engineer / Databricks

Company

Emerson United Inc

Address United States
Employment type Full-time
Salary
Expires 2024-01-13
Posted 9 months ago
Job Description

Position Overview:

The Business Intelligence Data Engineer will be responsible for expanding and optimizing our data and data pipeline architecture, and for supporting cross-functional teams in generating timely insights. The ideal candidate is a data specialist experienced in designing, developing, and deploying complex data pipelines on the Azure Cloud platform.

The Business Intelligence Data Engineer will support our software developers, database architects, data analysts, dashboard developers, and data scientists on data initiatives, and will ensure that an optimal data delivery architecture remains consistent across ongoing projects. The role requires solid technical skills in designing and delivering large-scale enterprise data platforms on Azure Cloud, combined with very strong communication skills.

Position Responsibilities include, but are not limited to:

· Deploy new solutions and configurations to meet business and compliance requirements.

· Participate in 24x7 on call rotations.

· Discover current technical standards and best practices (R&D).

· Deploy security patches, updates, and configuration changes.

Position Requirements:

· Work with multiple business stakeholders to define the right data requirements to fulfill growing analytics and insights needs across the enterprise.

· Create and maintain an optimal data pipeline architecture.

· Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability, lower cost, and better performance.

· Design the right infrastructure and compute configuration for optimal extraction, transformation, and loading of data from a wide variety of data sources into ADLS, Databricks, and Synapse.

· Develop data pipelines using PySpark, Python, and Databricks SQL in Databricks within a Lakehouse architecture.

· 5+ years of experience in a data engineering environment, with hands-on experience developing Azure Data Factory (ADF) pipelines for an enterprise solution.

· 3+ years of experience writing Python code in Databricks to transform and manipulate data (ETL/ELT), along with managing objects in Notebooks, Data Lake, ADLS, and Azure Synapse.

· Experience writing complex SQL queries, user-defined functions, stored procedures, and materialized views; ideally someone who comes from a database development background and has transitioned to Azure Cloud / Data Lake / Synapse.

· Working experience with Azure DevOps and source control.

· Experience working in a large retail enterprise and an understanding of retail data and reporting models.

· Experience with reporting tools such as Power BI, Tableau, or MicroStrategy.

· Strong analytical skills related to working with different types of datasets from a wide variety of data sources.

· Strong project management and organizational skills.

· Experience supporting and working with cross-functional teams in a dynamic environment.

· Understanding of ELT and ETL patterns and when to use each.

Position Qualifications:

· Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.

· Experience using the following software, tools, and services: Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Synapse, SQL, and PySpark.

· Experience with relational SQL and NoSQL databases.

· Experience with data pipeline and workflow management tools.