Similar job postings
Solutions Architect - Azure
Recruited by InterEx Group 8 months ago Address Greater Chicago Area, United States
Associate Architect/Land Architect
Recruited by City of San Jose 11 months ago Address San Jose, CA $117,980 a year
CRM Data Architect Jobs
Recruited by Ford Motor Company 11 months ago Address Dearborn, MI 48120
Director Data Management Jobs
Recruited by Laka & Company 1 year ago Address Greater Chicago Area, United States
Enterprise Data Engineer - Snowflake
Recruited by Arthur Grand Technologies Inc 1 year ago Address Rockville, MD
Marketing Data Architect - Lev (Remote)
Recruited by Cognizant 1 year ago Address Indianapolis, IN 46220

Data Architect Jobs

Company

Gibson Consulting

Address Greater Chicago Area, United States
Employment type Full-time
Salary
Category IT Services and IT Consulting; Transportation, Logistics, Supply Chain and Storage; Business Consulting and Services
Expires 2023-12-28
Posted 10 months ago
Job Description

Gibson Consulting is looking for a Data Architect to join our team. The ideal candidate will have strong knowledge of SQL, Python, cloud-based pipelines, and architecture. The Data Architect will design and build data architectures and data pipelines, develop data models, analyze data sets, and create visualizations. They will also ensure the accuracy and consistency of data and guide the development and implementation of data policies.


The ideal candidate will have excellent problem-solving and communication skills and be able to work collaboratively with a variety of high-level stakeholders. They must also be able to work independently and manage their own workload.


The candidate must be comfortable working in a constantly changing, fast-paced environment. They must also be curious and eager to continue advancing their skills in additional areas such as ETL/ELT, machine learning, and artificial intelligence.


The position reports to the Principal Data Architect.


Principal Duties and Responsibilities

Develop ETL routines for extracting, transforming, and loading large data sets into an Azure SQL data warehouse using tools such as Azure Data Factory, SQL, and Python


Perform data enrichment and normalization to resolve requirement conflicts and gaps between various data sources and business entities


Apply sound knowledge of entity-relationship and multidimensional data models, reports, and diagrams


Work within a cross-functional team and drive results


Conduct training sessions on new data techniques and methodologies


Document and develop ETL routines related to fact tables, dimension tables, star/snowflake schema models, slowly changing dimensions, foreign-key concepts, and referential integrity


Ensure that data quality and data validation specifications and standards, as well as data cleansing/scrubbing techniques, are adhered to consistently


Demonstrate a desire to learn advanced data analysis techniques and stay abreast of the marketplace


Bring creative solutions to unique data issues


Travel (domestic and international) to client sites may be required (up to 50%)
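
Several of the duties above — dimension tables, slowly changing dimensions, and referential integrity — come together in a Type 2 slowly changing dimension. The following is a minimal sketch using SQLite; the table, column names, and sample values are illustrative assumptions, not part of this role's actual schema.

```python
import sqlite3

# Hypothetical dimension table for a Type 2 slowly changing dimension (SCD2):
# attribute changes expire the current row and insert a new version,
# preserving history. All names below are assumptions for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id  TEXT,     -- natural/business key
        city         TEXT,     -- tracked attribute
        valid_from   TEXT,
        valid_to     TEXT,     -- NULL = still current
        is_current   INTEGER
    )
""")

def upsert_scd2(conn, customer_id, city, as_of):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    cur = conn.execute(
        "SELECT customer_key, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] == city:
        return  # no change: keep the current version
    if row:
        # Close out the old version rather than overwriting it
        conn.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_key = ?", (as_of, row[0]))
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (customer_id, city, as_of))

upsert_scd2(conn, "C001", "Chicago", "2023-01-01")
upsert_scd2(conn, "C001", "Dearborn", "2023-06-01")  # second version; first is expired
```

The surrogate key (`customer_key`) rather than the business key is what fact tables reference, which keeps referential integrity intact while the dimension accumulates versions.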

Requirements:

  • Knowledge of SQL, Python, and cloud-based pipelines
  • Ability to analyze and interpret data sets
  • Excellent problem-solving, communication, and organizational skills
  • Understanding of data governance, security, and privacy
  • Bachelor’s degree or applicable work experience in Computer Science, Information Systems, Information Management, or a related field
  • Strong Excel skills
  • Experience with data visualization
  • Ability to work on a cross-functional team and provide analytical insights

Preferences:

  • Experience in cloud-based computing, e.g., Azure, AWS
  • ETL experience from multiple data sources
  • Previous exposure to Python, C++, Java, or other programming languages
  • Experience with Azure Data Factory or similar pipeline tools
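
The ETL experience called for above can be sketched end to end in plain Python. The CSV source, schema, and cleansing rule below are assumptions for illustration; a production pipeline would typically orchestrate these steps with a tool such as Azure Data Factory.

```python
import csv
import io
import sqlite3

# Extract: a CSV source, stubbed here with an in-memory string (assumed data).
raw = io.StringIO("order_id,amount\n1, 19.99 \n2,5.50\n3,\n")

def transform(rows):
    """Normalize and validate: strip whitespace, coerce types, drop rows missing an amount."""
    for row in rows:
        amount = row["amount"].strip()
        if not amount:
            continue  # data cleansing: skip incomplete records
        yield int(row["order_id"]), float(amount)

# Load: insert the cleaned rows into a fact table (schema is an assumption).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?)", transform(csv.DictReader(raw)))
```

Keeping the transform as a generator means each record is validated and loaded in a single pass, which is the same extract–transform–load shape pipeline tools express with activities and sinks.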