Data Architect
Company | Gibson Consulting |
Address | Greater Chicago Area, United States |
Employment type | Full-time |
Category | IT Services and IT Consulting; Transportation, Logistics, Supply Chain and Storage; Business Consulting and Services |
Expires | 2023-12-28 |
Posted | 10 months ago |
JOB DESCRIPTION
Gibson Consulting is looking for a Data Architect to join our team. The ideal candidate will have strong knowledge of SQL, Python, and cloud-based data pipelines and architecture. The Data Architect will design and build data architectures, develop data pipelines, and support data management and analytics; develop data models, analyze data sets, and create visualizations; and ensure the accuracy and consistency of data while guiding the development and implementation of data policies.
The ideal candidate will have excellent problem-solving and communication skills and be able to work collaboratively with a variety of high-level stakeholders. They will be able to work independently and manage their own workload.
The candidate must be comfortable working in a constantly changing, fast-paced environment. They must also be curious and eager to continue advancing their skills in additional areas such as ETL/ELT, machine learning, and artificial intelligence.
The position reports to the Principal Data Architect.
Principal Duties and Responsibilities
Develop ETL routines to extract, transform, and load large data sets into an Azure SQL data warehouse, utilizing tools such as Azure Data Factory, SQL, and Python
Perform data enrichment and normalization to resolve requirement conflicts and gaps between various data sources and business entities
Apply sound knowledge of entity-relationship and multidimensional data models, reports, and diagrams
Work within a cross-functional team and drive results
Conduct training sessions on new data techniques and methodologies
Document and develop ETL routines related to fact tables, dimension tables, star/snowflake schema models, slowly changing dimensions, foreign key concepts, and referential integrity
Ensure that data quality and validation specifications, standards, and data cleansing/scrubbing techniques are adhered to consistently
Demonstrate a desire to learn advanced data analysis techniques and keep abreast of the marketplace
Bring creative solutions to unique data issues
Travel (domestic and international) to client sites may be required (up to 50%)
Requirements:
- Knowledge of SQL, Python, and cloud-based pipelines
- Ability to analyze and interpret data sets
- Excellent problem-solving, communication, and organizational skills
- Understanding of data governance, security, and privacy
- Bachelor’s degree or applicable work experience in Computer Science, Information Systems, Information Management, or a related field
- Strong Excel skills
- Experience with data visualization
- Ability to work on a cross-functional team and provide analytical insights
Preferences:
- Experience in cloud-based computing, e.g., Azure or AWS
- ETL experience from multiple data sources
- Previous exposure to Python, C++, Java, or other programming languages
- Experience with Azure Data Factory or similar pipeline tools