Junior Big Data Platform Engineer
By Trenchant Employee Services Limited At Irving, TX 75039 $100,000 - $140,000 a year
Key responsibilities of the role include:
Using advanced troubleshooting skills to diagnose and fix problems
Researching and implementing new technologies in line with key business objectives
The following technology experience is beneficial but not essential:
Experience with at least one programming language
Work Location: Hybrid remote in Irving, TX 75039
Platform Developer Jobs
By N3bula Systems At United States
Remediate all security vulnerabilities pertaining to products managed by the team.
6+ years' experience with Linux operating systems
Candidate should have experience using the following software/tools:
Experience with Amazon Web Services (AWS)
Experience with one or more scripting languages (e.g. Bash, Python, PowerShell)
Modernizing Puppet code into modules
Platform Developer Jobs
By N3bula Systems At San Diego Metropolitan Area, United States
Remediate all security vulnerabilities pertaining to products managed by the team.
6+ years' experience with Linux operating systems
Candidate should have experience using the following software/tools:
Experience with Amazon Web Services (AWS)
Experience with one or more scripting languages (e.g. Bash, Python, PowerShell)
Location: San Diego, CA (Point Loma) (mostly remote)
Software Engineering Intern-Big Data Platform
By Samsung Electronics America At Mountain View, CA, United States
Preferred – Basic knowledge in Amazon Web Services, RDBMS, NoSQL Databases, big data tools and pipelines, Android development
Strong communication skills, the ability to work well in a fast-paced team environment, and self-motivation
Explore cutting-edge technologies for our products
Design and develop machine learning solutions for various device sensors (e.g., Wi-Fi, GPS, Cell)
Design experiments, perform evaluations, learn shortcomings and apply enhancements
Publish research results in conferences/journals and file patents
Manager, Data Science, Google Cloud Business Acceleration
By Google At Austin, TX, United States
Experience working with and developing for non-technical users (defining requirements, explaining technical concepts to non-technical business users, etc.).
Bachelor's degree or equivalent practical experience.
6 years of experience in statistical modeling, data mining, and data analysis.
Experience in data foundations: SQL, data manipulation procedural language, statistics, experimentation, and modeling.
Experience with ETL, data mining, and processing with multiple datasets using distributed computing.
Excellent verbal and written communication skills.
Software Engineer, Big Data Platform
By LiveRamp At San Francisco, CA, United States
Provide advanced data management and segmentation capabilities to our internal and external customers.
2+ years of experience writing and deploying production code
Solid programming skills in Java or Scala
Experience with Cloud providers like AWS, Azure, GCP
Bachelor's or Master's in Computer Science, Information Technology, Engineering, or a related field, or commensurate work experience
Tackle challenging problems such as implementing SOA and Kubernetes at scale and overhauling our data tier on GCP and beyond.
Technical Curriculum Developer, Google Cloud
By Google At Austin, TX, United States
Manage roadmap of Google Cloud technologies as new products, capabilities, and services are announced and made available.
5 years of experience in developing and delivering technical training or labs.
Experience writing in Markdown, JavaScript Object Notation (JSON), and YAML (YAML Ain't Markup Language).
Experience scripting or coding in one of the following languages: SQL, JavaScript, Python, PHP (Hypertext Preprocessor), Go, or Ruby.
Experience with Google Cloud Platform.
3 years of experience in technical training content development.
Analytics Developer (Big Data Cloud) - Intermediate
By Bayer, Remote
Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
Rapidly prototype new analytics views and work directly with stakeholders across multiple functions (Science, Marketing, Sales, Risk, Finance, Product)
5+ years of experience with Python, Scala, and/or Java
5+ years of experience in architecting, designing, developing, and implementing cloud solutions on AWS/GCP cloud platforms
5+ years of experience working with relational and NoSQL datastores
2+ years of experience with Spark, Flink, and/or distributed computing.
Big Data Platform Engineer
By Apple At Cupertino, CA
Validated software engineering experience in design, test, source code management, and CI/CD practices.
Problem-solving and debugging skills with experience in one or more of the following languages: Java, Python, Scala, Go, or Ruby.
Experience developing large scale distributed computing systems.
There is a lot of communication involved! Excellent interpersonal skills are highly valued.
Experience using data storage technologies such as Apache Parquet or Avro.
Experience in machine learning algorithms is a plus.
Experience in data modeling and developing SQL database solutions is a plus.
Cloud Big-Data Engineer Jobs
By PhasorSoft Group At Starkville, MS $45 an hour
Experience working with in-memory computing using R, Python, Spark, PySpark, Kafka, and Scala.
Experience in parsing and shredding XML and JSON, shell scripting, and SQL
Experience working with Hadoop ecosystem - HDFS, Hive.
Experience working with the AWS ecosystem - S3, EMR, EC2, Lambda, CloudFormation, CloudWatch, SNS/SQS.
Experience with Azure – Azure Data Factory (ADF).
Experience working with SQL and NoSQL databases.

Are you an experienced Big Data Developer looking to take your career to the next level? Join our team and help us build the future of data processing on the Google Cloud Platform. We offer a competitive salary and a great work environment. Come join us and make a real impact!

Overview:

A Google Cloud Platform Big Data Developer is responsible for developing and maintaining big data solutions on the Google Cloud Platform. This includes designing, developing, testing, deploying, and maintaining data pipelines, data warehouses, and data lakes, along with the ETL processes and data models that feed them.

Detailed Job Description:

The Google Cloud Platform Big Data Developer will develop and maintain big data solutions on the Google Cloud Platform: designing, building, testing, deploying, and maintaining data pipelines, data warehouses, and data lakes. This includes developing and maintaining the ETL processes and data models behind them, designing pipelines and warehouses that are optimized for performance and scalability, and troubleshooting and resolving any issues that arise with the data pipelines, data warehouses, and data lakes.
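To make the extract-transform-load responsibility above concrete, here is a minimal, hypothetical sketch of the ETL pattern in plain Python. The field names (user_id, amount) are illustrative assumptions, and a real pipeline on GCP would typically use managed services such as Dataflow or BigQuery rather than hand-rolled functions:

```python
# Hypothetical ETL sketch: parse newline-delimited JSON, normalize records,
# and collect rows. Field names are invented for illustration only.
import json
from datetime import datetime, timezone

def extract(raw_lines):
    """Extract: parse NDJSON records, skipping malformed rows."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # in production, route bad rows to a dead-letter sink

def transform(record):
    """Transform: normalize types and stamp processing time."""
    return {
        "user_id": str(record["user_id"]),
        "amount_usd": round(float(record.get("amount", 0)), 2),
        "processed_at": datetime.now(timezone.utc).isoformat(),
    }

def load(records):
    """Load: collect rows here; a real job would write to a warehouse."""
    return list(records)

raw = ['{"user_id": 1, "amount": "19.99"}', 'not json', '{"user_id": 2}']
rows = load(transform(r) for r in extract(raw))
```

The malformed middle line is dropped in the extract step, so `rows` ends up with two normalized records; this skip-and-continue handling mirrors the troubleshooting responsibility the description emphasizes.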

What skills are required for a Google Cloud Platform Big Data Developer?

• Experience with Google Cloud Platform (GCP)
• Experience with Big Data technologies such as Hadoop, Spark, and Kafka
• Experience with data modeling and data warehousing
• Experience with ETL processes
• Experience with SQL and NoSQL databases
• Knowledge of scripting languages such as Python and Java
• Knowledge of data security and privacy
• Ability to troubleshoot and debug data pipelines

What qualifications are required for a Google Cloud Platform Big Data Developer?

• Bachelor’s degree in Computer Science, Information Technology, or related field
• 5+ years of experience in developing and maintaining big data solutions
• Experience with Google Cloud Platform (GCP)
• Experience with Big Data technologies such as Hadoop, Spark, and Kafka
• Experience with data modeling and data warehousing
• Experience with ETL processes
• Experience with SQL and NoSQL databases
• Knowledge of scripting languages such as Python and Java
• Knowledge of data security and privacy
• Ability to troubleshoot and debug data pipelines

What knowledge is required for a Google Cloud Platform Big Data Developer?

• Knowledge of Google Cloud Platform (GCP)
• Knowledge of Big Data technologies such as Hadoop, Spark, and Kafka
• Knowledge of data modeling and data warehousing
• Knowledge of ETL processes
• Knowledge of SQL and NoSQL databases
• Knowledge of scripting languages such as Python and Java